
About the Authors

Daniel Sloan is an internationally recognized Six Sigma Master Black Belt and an ASQ certified Black Belt. His 16 years of experience have been distinguished by Six Sigma seminars in Mexico, Uruguay, Brazil, Australia, and 47 of the United States. McGraw Hill and Quality Press published five of his seven books. As a Senior Vice President of Applied Business Science for a $500 million company, he led their Six Sigma initiative. With "factory floor" Six Sigma successes ranging from non-woven fabrics, extruded products, medical equipment, aerospace engineering, automotive parts, to Internet router production and health care, Daniel has a proven track record in helping companies produce bottom line results.

Russell Boyles earned his PhD in Statistics at the University of California, Davis. He subsequently spent two years in the Applied Mathematics Group at Lawrence Livermore National Laboratory, two years as Director of Statistical Analysis for NERCO Minerals Company, and eight years as Statistical Process Control Manager at Precision Castparts Corporation. As a trainer and a consultant, Russell specializes in Six Sigma Master Black Belt and Black Belt certification courses, Design of Experiments, Gage Studies, Reliability and Statistical Process Control. A few of his recent papers have appeared in the ASQ publications Technometrics and Journal of Quality Technology.

Evidence-based Decision Services and Products

We are the first and best provider of evidence-based decision services in the world. We help clients rapidly use the evidence in their raw data to dramatically improve bottom line business results.

Six Sigma Services

Master Black Belt, Black Belt, Green Belt, Champion, and Senior Executive certification training for all industries including manufacturing, financial services, and health care.

Consortium Six Sigma events for small companies who wish to pool resources.

Custom designed training events and multi-media, evidence-based Six Sigma materials.

Evidence-based Decision Support

Data mining, strategic Information Systems design.

Bottom-line business results project coaching.

Consulting support to private industry, government and academic institutions that are implementing evidence-based decision systems.

Custom designed training events and multi-media, evidence-based education and training materials.

For more information visit http://www.danielsloan.com or call:

Sloan Consulting, Seattle, WA (206) 525-7858

M. Daniel Sloan, author and owner of the copyright for this work, has licensed it under the Creative Commons Attribution Non-Commercial Non-Derivative (by-nc-nd) License. http://www.danielsloan.com is the legal file download location. To view this license visit:

http://creativecommons.org/licenses/by-nc-nd/2.5/
http://creativecommons.org/licenses/by-nc-nd/2.5/legalcode

Or send a letter to:

Corporate Headquarters
Creative Commons
543 Howard Street
5th Floor
San Francisco, CA 94105-3013
United States

Profit Signals
How Evidence-based Decisions
Power Six Sigma Breakthroughs

By

M. Daniel Sloan and Russell A. Boyles, PhD

Sloan Consulting, LLC
Seattle, Washington

© M. Daniel Sloan and Russell A. Boyles, All Rights Reserved, 2003

Profit Signals: How Evidence-Based Decisions Power Six Sigma Breakthroughs.
M. Daniel Sloan and Russell A. Boyles

Library of Congress Cataloging-in-Publication Data
Sloan, M. Daniel, 1950-
Boyles, Russell A., 1951-
Profit signals: how evidence-based decisions power six sigma breakthroughs / M. Daniel Sloan and Russell A. Boyles
Includes bibliographical references and index.
1. Six Sigma—Quality control—Statistical models. 2. Medical care—Quality assurance—Statistical models. 3. Cost control—Statistical models—Mathematical models.

© 2003 by Evidence-Based Decisions, Inc. http://www.evidence-based-decisions.com

All rights reserved. No part of this book may be reproduced in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the publisher. Your support of the authors' rights is appreciated. For permissions the authors can be contacted directly.

Sloan Consulting http://www.danielsloan.com/
206-525-7858
10035 46th AVE NE, Seattle WA 98125

Trademark Acknowledgements

Profit Signals®, the phrase "Vector Analysis Applied to a Data Matrix®", and the Profit Signals tetrahedron on the book's cover are registered trademarks of Sloan Consulting, LLC. Six Sigma® is a registered trademark and service mark of Motorola, Incorporated. Sculpey Clay® is a registered trademark of Polyform Products Co. Excel® is a registered trademark of Microsoft. Other copyright notices are listed in the production notes at the end of the book.

Illustrations: Cover, Robin Hing. Tables and illustrations, Robin Hing, Russell A. Boyles, M.
Daniel Sloan, John Pendleton, Austin Sloan, and Alan Tomko. Netter illustrations used with
permission from Icon Learning Systems, a division of MediMedia USA, Inc. All rights reserved.

The book’s design and layout, using Adobe InDesign 2.0.2, were completed by M. Daniel
Sloan. Printed in the United States of America.


Many of the most useful designs are extremely simple.
Ronald Aylmer Fisher

How much variation should we leave to chance?
Walter A. Shewhart


Table of Contents

Premise
  The New Management Equation
  The Parable of the Paper Bags
  The Dollar Value of Evidence
  Six Sigma
  How to Read This Book
  Endnotes

Chapter 1—The Five-Minute PhD
  Start Your Stopwatch Now
  Business Art and Science
  Profit Signals
  Data Recycling
  The Full Circle of Data Discovery
  Closing Arguments
  Endnotes

Chapter 2—Standards of Evidence
  Poetry versus Science
  "Scientific" Management
  Accounting versus Science
  Cost Accounting Variance Analysis
  Spreadsheet versus Data Matrix
  Vector Analysis 101
  Degrees of Freedom
  Bar Chart Bamboozles
  Delusions and Bamboozles
  The Game is Afoot
  P-values, Confidence Levels and Standards of Evidence
  Closing Arguments
  Endnotes

Chapter 3—Evidence-based Six Sigma
  Six Sigma (6σ) Basics
  The Six Sigma Profit Strategy
  Lucrative Project Results Map
  Define, Measure, Analyze, Improve, Control
  Lucrative Project Selection
  Financial Modeling and Simulation
  Compare and Contrast Analysis
  Process Maps
  The Costs of Poor Quality
  Process Capability
  Endnotes

Chapter 4—Case Studies
  Customer Service – Governmental Agency
  Days in Accounts Receivable
  Breaking the Time Barrier
  "Beating Heart" Bypass Grafts
  The Daily Grind
  "Die Tuning" for Vinyl Extrusion
  Endnotes

Chapter 5—Using Profit Signals
  A Better Way to Look At Numbers
  Corrugated Copters
  Testing the Current Way of Doing Things
  Overcoming Obstacles
  Comparing Two Ways of Doing Things
  Comparing Three Ways of Doing Things
  Comparing Eight Ways of Doing Things
  Comparing 256 Ways of Doing Things
  Chapter Homework
  Closing Arguments
  Endnotes

Chapter 6—Predicting Profits
  Fingerprint Evidence
  Three Wishes
  Prediction Practice
  Predicting Real Flight Times
  Closing Arguments
  Endnotes

Chapter 7—Sustaining Results
  Evaluating Practices and Profits
  Process Improvement Simulation
  Monitoring Practices and Profits
  Taking Action
  Closing Arguments
  Endnotes

Chapter 8—The Three Rs
  Six Sigma's Hidden Factory
  Our Proposal
  Endnotes

Appendices
  I. Glossary of Terms: Data Matrix, Vector Analysis and Evidence-based Decisions
  II. Evidence-Based Decisions, Inc. Six Sigma Black Belt/Expert 16 Class Curriculum Outline
  III. The Business Bookshelf
  IV. Profit Signals Production Notes

Index

Premise

Profit Signals is a guide for using evidence to make better, more profitable business decisions. This book will show you how to turn measurements into evidence and evidence into profit. Face value judgments, opinions, suspicions, gut feelings, and superstitions are not pathways to evidence. Measurements are. Measurements become evidence when they are analyzed correctly. Since 1920, the correct analysis has consisted of a vector analysis applied to a data matrix. Sir Ronald Fisher first explained it at the beginning of the 1920s.1 Very few people know this. With this book, we aim to transform the arcane mysteries of vector analysis into common knowledge. Vector analysis is a must-have, fundamental job skill. Every person in every organization can use this tool to make more money.

The word "generalization" usually denotes a thoughtless, broad assertion with no basis in fact. By contrast, a Generalization is a verifiable law of the universe. Thus one word has two, opposite meanings.2 Unlike a generalization, a Generalization delivers valid conclusions and accurate predictions. The laws of motion are a Generalization. The laws of gravity are a Generalization. Gravity is a physical constant of our universe. The laws of Variation, the chance fluctuations or Noise that attend every measurement, are a Generalization. Vector analysis is a vast, audacious and empirically true Generalization. Evidence is the foundation of Profit Signals, and vector analysis is the foundation for evidence.

Just as you can measure gravity (Newton and the apple), Generalizations like statistical variation can be tested and validated through a process of experimentation, observation, and analysis. The keys to making better, more profitable business decisions are (1) identifying and (2) interpreting the profit signals in your raw data. Profit signals are the most important element in any data-driven business decision. Vector analysis is the only way to identify profit signals.

A vector is a set of numbers that is treated as a single entity. A vector is best visualized as an arrow connecting one point in space to another. A vector defines magnitude and direction. Evidence-based decisions focus on the three vectors on the right side of the following vector analysis equation:

Raw Data = Data Average + Profit Signal + Noise

An evidence-based decision evaluates three key vectors: 1) a Data Average, 2) a Profit Signal, and 3) Noise. The vector analysis equation is much easier to understand when it is presented as a picture. The six edges of the tetrahedron shown in Figure 1 represent the six different ways of combining the three vectors on the right side of the equation. We call this stable geometric figure "the cornerstone of evidence."

Figure 1: A vector analysis requires a minimum of three dimensions.

The cornerstone of evidence is even easier to grasp in its physical form. In Profit Signals you will learn how to build one with bamboo skewers and Sculpey Clay®. The construction process is fun and informative. Knowing how to find and graph your profit signals are extraordinary money-making skills.
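The vector analysis equation above can be made concrete in a few lines of code. The sketch below is our illustration, not a tool from the book, and the four measurements in it are hypothetical: it splits the raw data into a data average, the signal for one two-level factor, and leftover noise, then confirms the three parts rebuild the data exactly.

```python
import numpy as np

# Four hypothetical measurements: two made the old way, two made a new way.
raw = np.array([9.0, 12.0, 10.0, 15.0])

# Data Average vector: every entry is the grand mean.
average = np.full_like(raw, raw.mean())

# Profit Signal vector: the variation explained by the factor, found by
# projecting the data onto the factor's -1/+1 coding.
factor = np.array([-1.0, 1.0, -1.0, 1.0])
signal = factor * ((raw - average) @ factor) / (factor @ factor)

# Noise vector: the chance fluctuation left over.
noise = raw - average - signal

# Raw Data = Data Average + Profit Signal + Noise, exactly.
assert np.allclose(raw, average + signal + noise)
```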

Relatively few people are aware of vector analysis and its universal relevance. Many do not understand the fundamental difference between a data matrix and a spreadsheet. Profit Signals fills in those educational gaps. With every chapter you will understand more clearly what profit signals are. You will soon know why they are invaluable.

The break-even school of thought has dominated business decisions since 1918. That was the year G. Charter Harrison, a London accountant employed by Price, Waterhouse & Company, published "Principles of a Cost System Based on Standards" in Industrial Engineering magazine.3,4,5 Since then Harrison's accounting principles and procedures have become universally accepted. They are known today as cost-accounting variance analysis.6

Figure 2: The break-even school of thought was founded by G. Charter Harrison in 1918. Averaged expenses and averaged income, in dollars, are plotted against product or service volume; the lines cross at the break-even point.

Figure 2 illustrates traditional break-even point analysis. It assumes that average expenses and average income are perfect linear functions of volume. It is based on the difference between the two lines—the differences between average expense and average income at various volumes. The lines cross at the "break-even point." Technically speaking, these differences are called predicted values. "Cost-accounting variance analysis" is inherently one-dimensional. As shown in Figure 3, these differences form the vector at the back of the tetrahedron.
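Under the linear assumptions just described, the break-even point itself is simple arithmetic. The sketch below is our hypothetical illustration; the cost and price figures are made up, not drawn from the book.

```python
# Break-even analysis under the model described above: average expense and
# average income are assumed to be perfect linear functions of volume.
# All numbers are hypothetical, for illustration only.

fixed_expense = 50_000.0   # averaged expenses at zero volume
expense_per_unit = 8.0     # slope of the averaged expense line
income_per_unit = 12.0     # slope of the averaged income line

# The lines cross where income equals expense:
#   volume * income_per_unit = fixed_expense + volume * expense_per_unit
break_even_volume = fixed_expense / (income_per_unit - expense_per_unit)
print(break_even_volume)   # 12500.0 units
```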

It is but one of the six vectors required for a complete, three-dimensional vector analysis.

Figure 3: The differences between average income and average expense form one vector in the set of six required for a complete vector analysis.

The accuracy of any predicted value depends on the length of the noise vector. Equally important, the strength of evidence supporting any conclusion depends on a ratio involving the profit signal and noise vectors. This F ratio, as it is called, compares the length of the profit signal vector to the length of the noise vector. Unfortunately, in break-even analysis, predicted values are used in isolation. In other accounting cases, the analysis is based solely on the raw data.

There was no cross-pollination between Harrison's work in London and Sir Ronald Fisher's simultaneous 1918 development of vector analysis in rural England.7 As a result, cost-accounting variance analysis evolved as a collection of one-dimensional methods. Because the methods of cost-accounting variance analysis are inherently one-dimensional, it is impossible for any of them to produce a correct analysis.

Establishing performance standards and evaluating actual results in relation to them were important steps forward. Modern textbooks teach cost-accounting variance analysis as a way to identify causes of profits and losses. For example, according to one Top-10 business-school accounting text, a cost accounting variance analysis can "…decompose the total difference between planned and actual performance into elements that can be assigned to individual responsibility centers."8
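Here is a minimal sketch of that comparison, our illustration with hypothetical numbers. Because the profit signal and noise vectors are perpendicular, the squared length of the variation vector splits cleanly into the two pieces, and the F ratio compares them after adjusting for degrees of freedom.

```python
import numpy as np

# Hypothetical data: two measurements under each of two ways of doing things.
raw = np.array([9.0, 12.0, 10.0, 15.0])
factor = np.array([-1.0, 1.0, -1.0, 1.0])

variation = raw - raw.mean()                                # total variation
signal = factor * (variation @ factor) / (factor @ factor)  # profit signal
noise = variation - signal                                  # leftover noise

# The squared lengths add, because signal and noise are perpendicular.
assert np.isclose(variation @ variation,
                  (signal @ signal) + (noise @ noise))

# F ratio: signal length versus noise length, each scaled by its degrees
# of freedom (1 for the factor, n - 2 = 2 for the noise).
f_ratio = (signal @ signal) / 1 / ((noise @ noise) / 2)
print(f_ratio)  # 16.0 / 2.5 = 6.4
```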

This sounds like a vector analysis. It is not. Cost-accounting variance analysis is just arithmetic. It is a set of one-dimensional analysis methods incapable of distinguishing profit signals from noise. It suppresses five-sixths—83 percent—of the accounting and analysis information that is contained in raw data. It hasn't changed one whit since the day it was born during the 20th Century's "scientific management" craze. Like the whalebone corsets of that era, it is a constricting artifact.

Transparency is indispensable to evidence-based decisions. It is a desirable accounting quality. Transparency implies the full disclosure of all elements. Cost accounting variance analysis lacks transparency. It creates an outward appearance of propriety while it conceals covert improprieties. Vector analysis, by contrast, is transparent.

In the past, we had to wade through volumes of bewildering algebra to analyze data. Few of us ever saw the cornerstone of evidence. Few of us were able to take evidence to our bottom line: "How can I personally use vector analysis to solve my problems and make my business more profitable?" 21st Century teaching methods and computer graphics place vector analysis in its rightful position in business decision-making.

The one formula we will use in this book is the Pythagorean Theorem: c² = a² + b². (If you now have a frown on your face, you probably learned about this idea in your favorite high school class, geometry.) This equation defines the right triangles comprising the cornerstone of evidence: the square of the long side of a right triangle equals the sum of the squares of the other two sides. The vast majority of evidence-based decisions are based on this simple formula. We call it the New Management Equation. We trust you will too.

The president of a $500 million company put it this way: "In old-school cost accounting we determine the variance and there the analysis stops. With the New Management Equation we determine the variance and there the analysis begins." This vector analysis is represented by the forward right triangle in Figure 4.

Figure 4: Cost accounting variance analysis ends with variations from standard values. A vector analysis begins with variations around the data average. The variations vector is then broken up into two components: 1) the Profit Signal vector and 2) the Noise vector.

The differences between a vector analysis applied to a data matrix and a spreadsheet analysis applied to arbitrary clusters of numbers are irreconcilable. The more you know about the cornerstone of evidence, the more you will understand how using just one of six possible vectors can misrepresent evidence and damage profitability. Our indictment is harsh. We will help you challenge it. Then we will ask that you vote in favor of full disclosure and transparency.

The Parable of the Paper Bags

We wrote Profit Signals for business leaders who are resolute competitors. Competitors are human. Therefore, all of us face the same challenges when we tackle the Six Sigma body of knowledge for evidence-based decisions. One of our novice students shared a personal story on a first day of training. We use her parable of the paper bags in our Six Sigma decision courses.

"What you are teaching us is a new skill that is hard to grasp. This process reminds me of the way my grandfather ran his business, and we both belong in this class.

"I loved my grandparents very much. I was very close to them.

"My grandfather was born in Sylva, North Carolina in 1893. He had to leave school in the second grade to go work on a tobacco farm. During the Great Depression, he and his wife couldn't earn a living. When the Federal government bought his parcel of land in North Carolina for an addition to Smokey Mountain National Park, he took the money and bought property here in Arlington. So, with his brother's family, they packed up their car and traveled across country to Darrington, Washington.

"He originally worked for the Sauk Logging Company. But he preferred to work for himself. He split cedar shakes in his spare time to supplement the family income. My grandparents planted 80 cherry trees. They had 10 acres of raspberries and a five-acre garden. They sold produce. There were milk cows and always some beef. Grandpa bought and sold heifers.

"There were all kinds of transactions. But my grandfather didn't know how to multiply. Instead, he kept all his receipts in different brown paper bags; there were 13 of them altogether. Once a month he would arrange these bags in the living room. Then he would add up columns of numbers so he would know what to charge people.

"I learned how to multiply in third grade. I got pretty good at it by the fourth grade. One day, I think it was in 1959, I came home and proudly told him I could teach him to multiply. By multiplying he wouldn't have to spend so much time with his paper bags. He listened and he learned how to do a few simple problems correctly. But he couldn't bring himself to believe in this new-fangled way of doing things. He never trusted multiplication. After he picked it up he went right back to using his paper bags.

"His system worked. But, gosh, what he sacrificed. I have always found math to be difficult. I still do. I have to work at it and I really would rather do something that comes easy."

There is no doubt about it. The break-even thinking of cost-accounting variance analysis works. Nevertheless, the testimony of Arthur Andersen, Enron, Cendant, Coca Cola, Rite Aid, WorldCom, and other companies of former greatness suggests that the way it works is costly.

Vector analysis theory and tools are to multiplication as multiplication is to addition. Though these ideas do not intimidate children, they can threaten adults. As you and your colleagues work to master evidence-based decisions, do not be surprised if you observe anger, denial, bargaining, and depression. At the end, we hope you will arrive at acceptance. This cycle accompanies any and every substantive life change. Anticipate this roller coaster. Negotiate and get to "Yes" with your peers.9 Get to yes with your executives and those you lead so that your company is not using brown paper bags to compete against a more powerful, efficient, effective, and profitable way of doing work. That way of doing work is a vector analysis applied to a data matrix.

The Dollar Value of Evidence

The quality of a manager's decisions and consequent actions determine profit and loss. As senior executives, and as consultants to senior executives, we have seen this process repeated month after month, year after year, decade after decade. Managers weigh their "evidence". They watch production. They pore over monthly spreadsheet reports. Usually, the most valuable information a manager has remains buried in the spreadsheets. It is further obscured by arithmetic totals, averages, differences, bar graphs and pie charts. They are valuable time savers. Power and beauty are their strengths, but also their Achilles heel.

It would be a generalization of the non-mathematical, non-scientific variety to claim that vector analysis is the solution to problems of this magnitude. Nevertheless, we propose that vector analysis, and the vector analysis mind set, are in fact important parts of business decision solutions. We ask you to critically evaluate this proposal.

With the pressure to produce profits, it is easy to understand why many managers resort to an expedient device: appearances. The privileges of position—title, office location and furnishings, clothing, automobiles, social networks, and financial reward—can and often do persuade others that appearance is evidence. Evidence-based decisions call for a higher standard.

Whenever a manager looks at the numbers used to measure the performance of an organization, certain questions ought to begin to perk:10

1. Should I believe these numbers?
2. What is the evidence in these numbers, and how strong is it?
3. What actions should I take based on this evidence?
4. What evidence will confirm that management actions produced the desired results?

Because we are only human, these questions are accompanied by unsettling feelings and thoughts that nurture anxiety: a) I am comfortable with the way things are; b) This new knowledge puts my previous decisions in a bad light; c) I don't want to lose my job.

You are probably reading this book because you have made business decisions. Some of those decisions were good. Some were bad. Some were based on evidence, others were not. We ask you to contrast the profit related to good decisions with the loss related to bad ones. The difference between these two numbers forecasts the initial Return on Investment (ROI) you can expect from reading this book. In most cases, ROI is at least 10:1. A more typical result is 50:1. We know this from serving individual and corporate customers in virtually every industry, in Australia, Brazil, England, Mexico, New Zealand, Singapore, Uruguay, and 44 of the United States.

Because we are all human, evidence and ROI may not be enough. We must handle fear. It may help you to know that, in every case where our students have used the information we present, bottom line business results like these eventually break through the barriers of fear.

Our clients welcome the opportunity to improve on their current methods of analysis and decision-making. Experience, evidence and, most of all, competition are forcing all of us to improve. The process of making an evidence-based decision is elegantly simple, yet it is profound. The one-, two-, three- and n-dimensional profit signals waiting to be discovered in your raw data can provide practical, rapid, profitable solutions to even the most complex, confounding and challenging business problems you face.11

Six Sigma

In today's popular press, evidence-based decisions are known as Six Sigma (6σ).12 Six Sigma made Profit Signals possible. It is not new. Bill Smith, an engineer at Motorola, conceived Six Sigma in 1986. Since 1986, Six Sigma has demonstrated its ability to improve productivity and profitability in every industry. This major step forward has produced trillions of dollars in profit. Each breakthrough spurs demand for further, more dramatic breakthroughs.

The iterative nature of the Six Sigma project cycle has taught us which parts of Six Sigma are essential. We, and other experienced professionals in the field, also have learned which parts are extraneous. We all have a natural aversion to change. The greater the change is, the greater our aversion. This is natural and good. Nevertheless, Six Sigma companies have made a conscious decision to conquer their reluctance. We return the favor by showing how to flex the evidence muscle without carrying the weight of bureaucracy. The demand for additional, dramatic breakthroughs can be satisfied only if we trim fat from Six Sigma's middle-aged spread.

Two fundamental Six Sigma concepts, the data matrix and vector analysis, have never been explained to anyone's satisfaction in any previous publication. Since 1986, Six Sigma has been based on two seemingly reasonable assumptions:

1. It is impossible to teach Six Sigma theory to everyone in a company.
2. Six Sigma tools are too difficult for most people to use.

We have discovered these assumptions are no longer valid. Personal computers and software have changed the world. Anyone can master what is called the Black Belt Body of Knowledge (BOK). Anyone and everyone can learn this unifying theory. They can learn it quickly. Based on our experience, and with the support of senior management leadership, this process can be accomplished in 10 to 16 days.

Today's requirements for Six Sigma leadership are simply these:

a) A passionate aptitude for pursuing the truth in a system.
b) An understanding of the nature of a physical law or Generalization.
c) The ability to operate carefully chosen statistical software.

You will learn fundamentals quickly. You will immediately be able to use what you learn to make evidence-based decisions. Improved decisions can lead you and your company to Six Sigma profits.

For over 2,000 years, the New Management Equation (c² = a² + b²) has helped people make money from measurements. The supporting evidence for this claim is overwhelming. It is well beyond any shadow of doubt. The underlying principles of a data matrix and vector analysis are timeless style. Profits are always in fashion. Six Sigma is a very classy way to earn them.

If your enterprise is to succeed, its products and services must exceed the great expectations of fickle customers. Goods and services must be able to withstand the scrutiny of the free press, and even an investigative Senate sub-committee. If you expect your business to meet these objectives, your enterprise must embrace and leverage the power of evidence-based decisions. This is the path we take and the case we make in Profit Signals.

Pick a profitable 21st Century product, service, or sport. Any one will do. The qualities of almonds, beer, fast food, gourmet ice cream, oil, pharmaceuticals, textiles, windows, agriculture, aviation, computers, electricity, electrocardiograms, global navigation systems, magnetic information media, movies, music, Olympic gold medal speed skating blades, roller ball pens, scuba diving, skiing, surgery, telecommunication, health care, and X-treme competition all share a common bond. Breakthroughs in every one are driven by disciplined observation, measurement, the recording of data in an orderly data matrix fashion, and analysis.13

We welcome you to the world of the data matrix, vector analysis, standards of evidence, improved business decisions, and the New Management Equation. Welcome to the universe of Profit Signals. Its principles run deep and far beyond rote, routine, mainstream business thinking. The cornerstone of evidence has stood the test of time.

How to Read This Book

You can speed read this book in about a week. To get the "big picture" quickly, skim the illustrations. Read the captions to these exhibits. After this initial overview, you may want to read it again at a more leisurely pace. If possible, complete the suggested experiments as you go. Feel free to collaborate on these with colleagues, friends, family members, neighbors—even your old high school teachers. The ideas, analogies and activities are presented in a particular sequence for good reason. So it is best to read the book front to back. "Closing arguments" at the end of each chapter summarize the key content.

Chapter 1: The Five-Minute PhD – The opening chapter lets you earn your PhD in evidence-based decisions. It takes only five minutes. Your Five-Minute PhD grants you the power of vector analysis. Call it vector power if you will. Once you get a handle on profit signals, you will be able to systematically quantify and prioritize the effects of multiple factors on any manufacturing, service or financial process.

Chapter 2: Standards of Evidence – We review the distinction between story telling and evidence. You will learn the difference between vector analysis applied to a data matrix and spreadsheet calculations applied to arbitrary clusters of numbers. In this chapter we trace the history of cost-accounting variance analysis and show how to improve it with vector analysis.

This chapter's inside joke and secret handshake are that a Greek named Pythagoras invented the "New Management Equation" 2500 years ago. (The New Management Equation is easier to say and spell than Pythagorean Theorem. It is also sweeter to the ear.) In the early 1920s a genius named Ronald Fisher discovered how to apply the New Management Equation to identify profit signals in raw data and quantify strength of evidence.14 Dr. Fisher's method is the international standard for quantitative analysis in all professions save two: accounting and business management.

Chapter 3: Evidence-based Six Sigma – If you are new to Six Sigma, this chapter has all the basics you need to know. It reviews the traditional Six Sigma tool set. We review the traditional Six Sigma breakthrough project cycle: Define, Measure, Analyze, Improve, Control (DMAIC). We cover organizational guidelines, project selection criteria, process maps and financial model graphs.

Chapter 4: Case Studies – We each have more than 20 years of consulting experience in the field of evidence-based decisions. Our results have been published in peer-reviewed textbooks.15 George E. P. Box, a Fellow of the Royal Society and elected member of the Academy of Arts and Sciences, endorsed one of our three-dimensional analysis books in 1997. CEOs, middle managers and line workers have signed affidavits and testified to the value of our work. In this chapter, we tell a few of our favorite breakthrough project stories. These include:

• Improving the quality of state government customer services with a $525,000 pay off.

• Reducing the days in accounts receivable by 30 days with a 14-day project and a $425,000 bottom line impact.

• Improving the operations of a hospital's Emergency Department (ED) with a gross margin of $18 million for a 38.2 percent gain over the prior year's performance.

• Dramatically improving the patient outcomes in cardiovascular surgery while putting $1 million in additional profits on a hospital's bottom line.

• Doubling the productivity in a vinyl extrusion process while reducing the product material costs by 50% in three months time. Bottom line value for this company for each of the next three years is $1 million, equaling a grand total of $3 million.

• Tool grinding breakthroughs worth $900,000.

Chapter 5: Using Profit Signals – This chapter presents the fundamentals of vector analysis with a few pages of reading and a physical model. You will learn how to expose the profit signals in your own data and represent them with bamboo skewers and Sculpey Clay®. You will tackle the following challenges facing the Corrugated Copter Company:

• Establishing baseline performance metrics.
• Comparing two ways of doing things.
• Comparing three ways of doing things.
• Comparing 256 different ways of doing things.

Chapter 6: Predicting Profits – Corrugated Copter managers want to be able to accurately predict flight times, costs, profits, inventory, and other important things. If they could improve the quality of their predictions, they could confidently take better advantage of market dynamics. Is this something new and different? No. It is vector analysis applied to a data matrix. Fortunately, this management team is up to speed on regression analysis; it is just another way to use the New Management Equation.

Chapter 7: Sustaining Results – At Corrugated Copter, best business results always means earning the greatest revenue with the least expense. This is more than a politically correct platitude. It is responsible stewardship. The team learns to monitor and perfect their production processes through process capability studies and process control charts. Are these new and different? No. They are vector analysis applied to a data matrix. They are just other ways of using the New Management Equation.

Chapter 8: The Three Rs – In its time, the Six Sigma business initiative created new breakthroughs in quality, productivity and profitability. Corrugated Copters now believes traditional Six Sigma organizational ideas are outdated. A team of Corrugated Copters leaders has proposed an education system that would render their Six Sigma bureaucracy obsolete.

Appendices – Here you will find a glossary of Profit Signal terms that will help you learn the language of evidence-based decisions. There is a complete bibliography of the essential evidence-based decision bookshelf. We have also included a Profit Signals Black Belt Curriculum and production information on the Six Sigma tools we used to write and produce this book. We trust this information will serve as your outline for future study.

Endnotes

1 Box, Joan Fisher. R. A. Fisher, Life of a Scientist. John Wiley & Sons, New York, 1978.
2 Box, Joan Fisher. R. A. Fisher, Life of a Scientist. John Wiley & Sons, New York, 1978.
3 Harrison, G. Charter. "Cost Accounting to Aid Production – I: Standards and Standard Costs." Industrial Management, The Engineering Magazine, Volume LVI, October 1918.
4 Harrison, G. Charter. "Cost Accounting to Aid Production – II: Standards and Standard Costs." Industrial Management, The Engineering Magazine, Volume LVI, No. 5, November 1918, page 431.
5 Harrison, G. Charter. "Cost Accounting to Aid Production – II: Standards and Standard Costs." Industrial Management, The Engineering Magazine, Volume LVI, December 1918.
6 Johnson, H. Thomas, and Kaplan, Robert S. Relevance Lost: The Rise and Fall of Management Accounting. Harvard Business School Press, Boston, 1987.
7 Garrison, Ray H., and Noreen, Eric W. Managerial Accounting, 10th Edition. McGraw-Hill Irwin, Boston, 2003.
8 Anthony, Robert N., and Reece, James S. Accounting: Text and Cases, Eighth Edition. Irwin, Homewood, 1989, page 941.
9 Fisher, Roger, and Ury, William. Getting to Yes: Negotiating Agreement Without Giving In. Penguin Books, New York, 1981.
10 Royall, Richard. Statistical Evidence: A Likelihood Paradigm. Chapman & Hall, New York, 1997.
11 Shewhart, Walter A. "Nature and Origin of Standards of Quality." The Bell System Technical Journal, Volume xxxvii, Number 1, January 1958, page 8.
12 Six Sigma is a registered trademark and service mark of Motorola Incorporated. The Motorola web site is a recommended resource for researching this history of Six Sigma: http://mu.motorola.com/pdfs/Mot_Six_Sigma.pdf. For a summary overview please read: Barney, Matt. "Motorola's Second Generation." Six Sigma Forum Magazine, May 23, 2002, pages 13-16.
13 Fuller, Buckminster. Critical Path. St. Martin's Press, New York.
14 Sloan, M. Daniel. Using Designed Experiments to Shrink Health Care Costs. ASQ Quality Press, Milwaukee, 1997.
15 Sloan, M. Daniel, and Torpey, Jodi B. Success Stories on Lowering Health Care Costs by Improving Health Care Quality. ASQ Quality Press, Milwaukee, 1995.


Chapter 1—The Five-Minute PhD

PhD's, medical doctors, engineers, mathematicians, statisticians, economists, scientists, managers, and executives don't own the lock and key to data analysis. You don't need a certificate on your wall to analyze data. Anyone can learn to do a vector analysis. The Five-Minute PhD is a democratic degree that exemplifies our age. It can be earned by anyone who is willing to work at it. What was once the high water mark of postgraduate study is now as simple as a Google web search.

Knowledge and its application are the taproots for professional stature and income. So long as information and knowledge remain shrouded by jargon, professions and authority remain secure. Knowledge and information challenge authority. They can be disrespectful of bureaucracy and hierarchy. They applaud the pointed question. They reward the cross-examination of high priests and presidents. Few are eager to debunk the presumption of specialized knowledge that justifies position and paycheck.

With knowledge and information, people can and do effectively solve more of their own problems. Solving one's own problems saves time and money. This is fun. Companies that know how to solve problems quickly make more money than those that don't.

We all acquire memories through the experience of our everyday lives. Memories make life rich and rewarding. Memories can teach, but they rarely bring innovation to our work places.

Except for an occasional stroke of dumb luck, we get no new knowledge from casual experience. With the sole exception of pure mathematics, we obtain new knowledge only by applying the basic disciplines of experimentation, observation, and analysis. Companies that apply knowledge and intelligence make more money than those that depend on memories. You will now learn how these three basic disciplines—experimentation, observation, and analysis—work, and prove that they do work, in less than five minutes. Neither previous experience nor training nor calculations are required.

Start Your Stopwatch Now

Raw data contain information. Good information leads to reliable predictions. In our digital world all information can be, and is, turned into numbers. Telephones work, airplanes fly, music is played, products are manufactured, medical treatments are rendered and services are delivered, all through the use of numbers.

Experimental data come from disciplined observation by design. When time and money are valued, experiments must be sized with economy in mind. Table 1 arrays the eight observations in an economical experiment. It contains all eight possible combinations of a three-factor experiment with two levels for each factor. The factors1 in this three-dimensional experiment are gender (x column), backpack weight bearing (y column), and activity (z column). For convenience, each factor was set at only two levels. The low setting is coded -1. The high setting is coded +1. Heartbeats is the measured response (dependent variable) in the experiment.

This is called a 2³ (two raised to the third power) experimental design. "Two raised to the third power" is a mouthful, so it is usually pronounced "two cubed." Think of tea with two cubes of ice, and the idea will be more refreshing. The first column in the array, labeled "Experiments Called Runs," establishes the order in which the eight observations were made.
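For readers who like to compute along, the eight runs of a 2³ design can be generated in a few lines. This sketch is our illustration; the factor names follow the experiment described above, and the run order shown is the systematic one the code produces, not necessarily the order used in the actual experiment.

```python
from itertools import product

# Generate the 2^3 ("two cubed") design: all eight combinations of three
# two-level factors, coded -1 (low) and +1 (high).
factors = ("gender", "backpack", "activity")
for run, levels in enumerate(product((-1, +1), repeat=3), start=1):
    settings = ", ".join(f"{name}={level:+d}"
                         for name, level in zip(factors, levels))
    print(f"Run {run}: {settings}")
```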

Table 1: The cube or "design of experiments" (DOE) array is an ideal data matrix.

As you can see in Figure 1, this array does in fact create a cube. Come back to this illustration after you complete your PhD.

Figure 1: The ideal data matrix forms a cube.

The clock is ticking. For each of the eight experiments, or runs, the resulting number of radial artery heartbeats was measured and recorded. (You can feel your own radial artery pulse by touching the inner aspect of one of your wrists with the index and middle fingers on the opposite hand.) These measurements are arrayed in the far right column of the data matrix.

Each of the eight response measures is the output from a unique combination of the three factors. For example, the measurement 70 for Run #1 was for a sitting man who had no weight in his backpack. This is an example of "disciplined observation."

Please turn your attention to the pattern of the response heartbeat measurements in the far right column. Consider the following questions:

• Which combination of variables produced the two highest heartbeats rates?
• Which combination produced the two lowest heartbeats rates?
• Which variable appears to have the least effect on heartbeats?
• How would you predict future outcomes from similar experiments?

Please pause now and stop reading. Go back and take time to look at the evidence patterns in the data matrix. When you have answered all four questions in the list, continue.

Are you finished? Check your watch. We predicted that you could successfully complete a three-dimensional, doctoral-level vector analysis in less than five minutes. We bet we were right. Your eyeball vector analysis of eight numbers was accurate.

If you concluded that aerobic exercise has the strongest effect on heartbeats, you have earned your Five-Minute PhD! If you noticed that carrying a 50-pound weight in a backpack increases the number of heartbeats for aerobic activity, but not for sitting, you graduated Cum Laude. If, in addition, you concluded that gender doesn't really make much of a difference when it comes to the number of heartbeats, you were able to correctly identify one main effect (activity), one inert factor (gender) and a two-factor interactive effect (the combination of activity and backpack weight). You are at the top of your class.

Business Art and Science

By using a special kind of row and column array called a data matrix, you can simultaneously quantify and prioritize the effects of several process variables.

Consider the economies of using this technique to solve business problems. It applies to any manufacturing, service or health care process, and it requires only relatively small amounts of data framed in a data matrix. Nowadays, inexpensive software effortlessly applies vector analysis to any data matrix. Software automatically calculates the profit signals. It ranks them by importance and determines the strength of evidence. Then, with the grace of a high technology thrill ride, software applications create three-dimensional graphs annotated with accurate predictions. We explain the details in Chapter 5.

The hallmark of Ronald Fisher's genius was his ability to visualize n dimensions. This was Imagineering at its very finest.2 Fisher's vector analysis is the elegant simplicity that underlies myriad, seemingly unrelated analysis techniques. This is the vast Generalization of the mathematical/scientific variety we mentioned.

The eight numbers in the far right hand column of the 2³ data matrix in Table 1 actually form a single entity. This entity is an eight-dimensional vector! You have now entered hyperspace. Science-fiction writers use the mathematical term hyperspace when they need a word to describe faster-than-light travel. Hyperspace is actually the mathematical term for a space with four or more dimensions. You succeeded in analyzing the three-dimensional experiment in Table 1 because you were able to visually compare the eight-dimensional vector for heartbeats to the eight-dimensional vectors for activity, gender, and backpack weight. These vectors are the basis for profit signals.

Hyperspace is not as easy to accept as a free ride on Disney's Space Mountain. Yes, it is true. You are absolutely right: for most of us, visualizing more than three dimensions is out of the question. Imagine the possibilities. So, fasten your seat belts. Please keep your hands and arms inside the analysis rocket.

Repeated experiments with the setting of (-1, -1, -1) or (Male, No Weight, Sitting) will produce an average heartbeat of about 71.25. This predicted value labels the lower, left, front corner of the cube in Figure 2. Repeated experiments at the setting of (+1, +1, +1) or (Female, 50-pound weight, Aerobics) will produce an average heartbeat of about 188. This predicted value labels the upper, right, back corner of the cube in Figure 2.

Figure 2: Vector analysis applied to a data matrix gives analysts the power of three-dimensional graphics. The differences in appearance between this illustration and richer ones elsewhere in our book can be explained: the superior tables and illustrations were created using vectors.

New vector analysis users are often amazed at the accuracy of predictions based on cubic and higher-dimensional experiments. You intuitively used data matrix and vector analysis principles to interpret the data in the Five-Minute PhD experiment. You will discover this for yourself as you complete the exercises in this book.

Ronald Fisher conducted the first cubic and higher-dimensional experiments in 1919, working at the English Rothamsted Experimental Station. He applied these principles to solve difficult, important problems using small, economical sets of data. William Gosset, who published under the pen name "Student" and first conceived the theory of statistical inference for small samples in 1907, used these statistics at the Guinness Brewery. Those who enjoy a stout beer now and again have been thankful ever since.

Eighty years of revolutionary advances in agriculture, biotechnology, communications, computing, finance, information technology, manufacturing, medicine, space technology, and transportation support Fisher's mathematical/scientific Generalization. Vector analysis applies to everything. It is practical. It is proven. It is used to describe Einstein's special and general theories of relativity. It is used to coordinate all commercial jet landings at Orlando International.3 It ought to be used to create financial statements.4

Vector analysis was used in 1908 by Willem Einthoven to create the electrocardiogram (EKG). It is used to graph voltage variations resulting from the depolarization and repolarization of the cardiac muscle. This physiological phenomenon is called a spatial vectorcardiographic loop. Frank Netter, the Norman Rockwell of medical illustration, drew a beautiful picture of a vector analysis (Figure 3). In Netter's drawing, the x-, y-, and z-axes are labeled using medical terminology. The x-axis refers to the sagittal, or side, planes of the body. The lower, horizontal y-axis is illustrated while the upper y-axis plane is implied. The back "frontal" plane is the illustrated z-axis plane. The plus (+) and minus (–) signs for Rhesus blood groups symbolize vector analysis reference points.5

Figure 3: The 2³ cube you used to analyze heartbeats is identical to EKG theory and Rh+/Rh- blood groups.

The EKG made it possible to observe, measure, and graph the heart's electrical impulses over time. The patterns that emerge from the EKG vector analysis are critical to the prediction of a beating heart's behavior. Knowledge produced by this Nobel Prize-winning achievement led to the creation of the most profitable niche in American medicine—cardiovascular care. Revisit this illustration when you read the Six Sigma case study on "beating heart" Coronary Artery Bypass Graft (CABG) surgeries in Chapter 4.

Table 2 lists a few of the thousands of proven, profitable applications of vector analysis to a 2³ data matrix. Consider the vast Generality, and the enormous profit potential, of this single tool. The only limitation is imagination. So imagine. Take time to write down factors (inputs) and responses (outputs) that could help you make more money. Once you have performed the disciplined observations and recorded the measurements demanded by the cube's data matrix, the eight-dimensional vectors—especially the all-important profit signals—will lead you directly to the most profitable solution.

Table 2: The cube experiment works for any process in any system.

2003 . loss. weak Profit N Signal indicate no statistically O NOISE IATI significant effect. strong Noise vector and a short. time. weak” profit signal vector and a “long. A “short. Five-Minute PhD 35 Profit. inventory turns. AV E RA GE IT R OF AL P GN SI © M. and sales volume—any response you can measure—depends on many factors and the interaction of those factors in your business. taxes. some are not. Figure 5 A long. strong Profit Signal with a short. statistically significant effects and reliable predictions (Figure 5). statistically insignificant effects and unreliable predictions (Figure 6). Complexity is the rule. strong” profit signal vector and a “short. RAW DATA Figure 6 A long. Some of these are under your control. The variation is DA TA VAR most likely due to Chance. The ratio of the length of the profit signal vector to the length of the noise vector quantifies the strength of evidence in the data. strong” noise vector indicate small. weak” noise vector indicate large. Boyles. weak Noise vector means there is a statistically significant effect. A “long. Daniel Sloan and Russell A. The only way to distinguish profit signals from noise is to apply vector analysis to a data matrix. All Rights Reserved. productivity. not the exception.

Data in the matrix must be obtained through a process of disciplined observation. Disciplined observation has served as a keystone of professional knowledge and profitability since 1630, when Rene Descartes introduced the method for three-dimensional thinking.6,7 Trial and error is expensive, crude, time-consuming, and ineffective. It is not a viable business strategy for the 21st Century.

The cube is three-dimensional; it has three factors. Despite the fact that we inhabit a world of three physical dimensions, there is no reason to limit ourselves to three-dimensional experiments.

Table 3: Multi-dimensional experiments improve profits in every industry.

Table 3 lists a few n-dimensional experiments with n ranging from 2 to 6. Disciplined, multi-dimensional observations and vector analysis reduce the financial risk associated with every important business decision. Senior managers and corporate directors are knowledge workers. They, more than Six Sigma Black Belts, statisticians, high school teachers, college professors, or any other members of an organization, need to know how these tools work. Business leaders must not excuse themselves from mastering this knowledge or the skills to go with it. To do so is to gamble the future of their companies on needlessly risky decisions. Typically, they can acquire this knowledge in just four days of accelerated, hands-on training.

Profit Signals

Only a few years before Fisher used the cube and higher-dimensional experiments to dramatically increase profitable crop yields in England, Pablo Picasso and George Braque created a new art form called Analytic Cubism. The analogies between Picasso's and Fisher's cubes are intriguing. Picasso and Braque aimed at presenting data as perceived by the mind rather than the eye. In Picasso's original Analytic Cubism, "objects were deconstructed into their components…."8 "Every aspect of the whole subject is seen in a single dimension. … it was used more as a method of visually laying out the FACTS…"9 Fisher explained his model and methods using virtually identical words.

He referred to vector analysis as the Analysis of Variance. Variance is a statistical measure based on the squared length of the variation vector. In cost accounting, a variance is a difference between an actual value and a standard or budgeted value. Fisher's definition pre-dates the accounting definition by forty-some years. We will discuss this further in Chapter 2. Spreadsheets and statistical programs often employ the hideous acronym ANOVA for Analysis of Variance. We are stuck with it, so we use it.
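For the record, Fisher's definition fits in a few lines of code. This is our sketch, with hypothetical data: the variance is the squared length of the variation vector divided by its degrees of freedom.

```python
import numpy as np

# Fisher's variance: based on the squared length of the variation vector.
data = np.array([9.0, 12.0, 10.0, 15.0])     # hypothetical measurements
variation = data - data.mean()               # the variation vector
variance = (variation @ variation) / (len(data) - 1)

# This matches the standard sample variance.
assert np.isclose(variance, np.var(data, ddof=1))
```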

An ANOVA “deconstructs” a data vector into the basic pieces essential for evidence-based decisions:

Raw Data = Data Average + Profit Signal(s) + Noise

We will discuss and illustrate various aspects of this vector equation in subsequent chapters. For now, we focus on the component of greatest immediate interest, the Profit Signal.

Table 4 contains a coded version of the Five-Minute PhD cube experiment. The numbers –1 and +1 are traditionally used to designate the low and high levels of each factor. Actual factor names are represented as generic X, Y and Z variables. The three factors in a cube experiment form the edges of a three-dimensional solid, a cube. In this sense, a cube experiment is three-dimensional. As a Generalization, however, here comes hyperspace again: as we saw in Figure 1, the data matrix for a cube experiment has eight combinations. The response column contains eight measurements, one for each combination. Actual heartbeat data are used in the response column, but these could be any measurements of interest to you. These factor combinations and measurements correspond to the eight corners of the cube.

Table 4 The coded version of the Five-Minute PhD cube experiment.
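Because we will refer to this coded layout repeatedly, here is a minimal sketch of it in Python, our choice of language for illustrations throughout (the book itself works in spreadsheets and statistical packages). It generates the eight –1/+1 factor combinations of Table 4:

    from itertools import product

    # Each corner of the cube is one combination of low (-1) and high (+1)
    # levels of the three generic factors X, Y and Z.
    design = [dict(zip("XYZ", corner)) for corner in product((-1, +1), repeat=3)]

    for run, corner in enumerate(design, start=1):
        print(run, corner)  # eight rows, one per corner of the cube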

The corners of the cube give a three-dimensional representation of an eight-dimensional data vector. We can use the cube to create three-dimensional representations of the eight-dimensional profit signal vectors in a cube experiment. Hyperspace—the real one, rather than the realm of Luke Skywalker—is a very big idea. Accepting this beyond-belief reality is easier said than done. Consider what Fisher’s ingenious, genius-level model suggests. It is beautiful art, and art is the dream of a life of knowledge.10 Acquiring the knowledge and mastering hyperspace analysis skills is well within everyone’s intellectual reach.

Figure 7 shows shaded, opposing planes corresponding to the –1 and +1 levels of the most important factor Z, Activity. These two opposing planes represent the eight-dimensional profit signal vector for the overall effect of factor Z. These planes correspond to the grouping of the eight response measurements shown in Table 5. This main effect is defined as the difference in the average response values for Z = –1 (Sitting) and Z = +1 (Aerobics):

Overall Z effect = (Average of 140, 180, 136, 190) minus (Average of 70, 68, 86, 88) = 161.5 – 78.0 = 83.5

Figure 7 The opposing planes represent the eight-dimensional profit signal vector for the overall effect of factor Z, Activity. This is a case of a “long/strong” Profit Signal and “weak/short” Noise.
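As a quick check on the arithmetic, this sketch reproduces the overall Z effect from the eight heartbeat measurements, grouped exactly as above:

    # Heartbeat responses grouped by the level of factor Z (Activity).
    aerobics = [140, 180, 136, 190]  # back plane,  Z = +1
    sitting = [70, 68, 86, 88]       # front plane, Z = -1

    def mean(values):
        return sum(values) / len(values)

    z_effect = mean(aerobics) - mean(sitting)
    print(mean(aerobics), mean(sitting), z_effect)  # 161.5, 78.0, 83.5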

Table 5 displays in a spreadsheet format the two groups of measurements from the original Five-Minute PhD experiment corresponding to the opposing planes in Figure 7.

Table 5 This grouping of the eight response measurements corresponds to the opposing planes in Figure 7.

If Z (activity) had no effect, the average response on the front plane of the cube would roughly equal the average response on the back plane. This is not the case. The average of the back plane is 83.5 heartbeats larger than the average of the front plane. Z had a large effect.

We reached these conclusions by comparing the column of response measurements in the data matrix to the columns representing the factors. In the Five-Minute PhD experiment, activity and backpack weight had noticeable effects on heartbeats. In actual practice, we must also quantify the strength of evidence for each profit signal. We will discuss this, and the related concept of statistical significance, in Chapter 2.

Figures 8 and 9 show the shaded, opposing planes representing the profit signal vectors for the main effects of factors Y and X.

Figure 8 Opposing planes representing the eight-dimensional profit signal vector for the main effect of factor Y. This is a case of a “long/strong” Profit Signal and “weak/short” Noise.

Figure 9 Opposing planes representing the eight-dimensional profit signal vector for the main effect of factor X.

Pairs of planes on the cube can also represent interactive effects. When the effect of one factor depends on the level of another, they are said to have an interactive effect. For example, in the Five-Minute PhD experiment, activity and backpack weight had an interactive effect: increasing the weight affected the number of aerobic activity heartbeats, but not the number of sitting heartbeats.

The planes representing interactive effects are still planes, but they are perpendicular rather than parallel. For example, one of the perpendicular planes in Figure 10 contains the four corners where X and Z have opposite signs (X × Z = –1). The other plane contains the four corners where X and Z have the same sign (X × Z = +1). These planes represent the eight-dimensional profit-signal vector for the interactive effect of X and Z, defined as the difference in the average response values for X × Z = –1 and X × Z = +1.

If you think about Star Wars while you mull over these images, they will be more entertaining. Plus, believe it or not, people do get up and cheer at the end of a multimillion-dollar breakthrough Six Sigma project, just like they do when good guys finally win on the big screen.

If X and Z had no interactive effect, the average response on each plane would be the same. If X and Z had a large interactive effect, one plane would have a much larger average than the other.

Figure 10 Perpendicular planes representing the eight-dimensional profit signal vector for the interactive effect of factors X and Z.

Figure 11 shows the shaded, perpendicular planes representing the profit signal vector for the interactive effect of factors X and Y. Figure 12 shows the shaded, perpendicular planes representing the profit signal vector for the interactive effect of factors Y and Z.

Figure 11 Perpendicular planes representing the eight-dimensional profit signal vector for the interactive effect of factors X and Y.

Figure 12 Perpendicular planes representing the eight-dimensional profit signal vector for the interactive effect of factors Y and Z.

Data Recycling

The three-dimensional cube diagrams provide a looking glass into eight-dimensional hyperspace. They allow our three-dimensional eyes to make some interesting observations. For example, have you noticed that all the corner values are used repeatedly? Every data point appears six different times, once in each of the six profit signal vectors shown above! This is a lot of work for only eight little numbers to do. This data-recycling phenomenon is a characteristic of all cubic and higher-dimensional experiments. The larger the number of factors, the greater the savings. For most companies, profit signals mean they can eliminate at least 83% of the data collection and storage costs incurred with primitive trial and error methods. Eighty-three percent is not a typographical error. It is simply a fact. This bottom-line result is enhanced by the fact that vector analysis gives you the right answers to your most pressing business problems.

This is not the 10th grade geometry taught by Mr. Greismeyer at Centerville High. This is the geometry of Michelangelo, da Vinci, Galileo, Alexander Calder, Einstein, Guglielmo Marconi, Orville and Wilbur Wright, and a Fellow of the Royal Society named Sir Ronald Fisher. These familiar geometric models convey esthetic beauty and analytical power.
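The data recycling is easy to verify in code. The sketch below assigns the eight measurements to the corners of the cube in standard order; this assignment is an assumption on our part, since the text specifies only the Z grouping. It then reuses all eight numbers to compute each of the six profit signals, three main effects and three interactions:

    # Hypothetical standard-order corner assignment; only the Z grouping
    # (the four aerobic responses at Z = +1) is taken from the text.
    X = [-1, +1, -1, +1, -1, +1, -1, +1]
    Y = [-1, -1, +1, +1, -1, -1, +1, +1]
    Z = [-1, -1, -1, -1, +1, +1, +1, +1]
    response = [70, 68, 86, 88, 140, 180, 136, 190]

    def effect(signs):
        """Average response where the sign is +1, minus the average where
        it is -1. Every one of the eight data points is used each time."""
        hi = [r for s, r in zip(signs, response) if s == +1]
        lo = [r for s, r in zip(signs, response) if s == -1]
        return sum(hi) / len(hi) - sum(lo) / len(lo)

    interactions = {
        "XY": [x * y for x, y in zip(X, Y)],
        "XZ": [x * z for x, z in zip(X, Z)],
        "YZ": [y * z for y, z in zip(Y, Z)],
    }

    for name, signs in {"X": X, "Y": Y, "Z": Z, **interactions}.items():
        print(name, effect(signs))  # Z comes out 83.5, as computed above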

The Full Circle of Data Discovery

We are back to where we started in the Premise. The data matrix and the New Management Equation, c² = a² + b², form the backbone of vector analysis, better known to college students as ANOVA. Vector analysis points to the entire family of common statistical distributions. Since 1920, vector analysis has been the path to credible, quantitative evidence.11 It was the foundation of science in the 17th Century when Galileo proved that the earth revolved around the sun.12 It was the foundation of science at the turn of the 20th Century when Einstein created his special and general theories of relativity.13 It is the foundation of science at the turn of the 21st Century.

Since 1935 the application of vector analysis to data matrices has produced huge financial returns in agriculture, engineering, manufacturing, health care and process industries.14,15,16 In those days this tool set was called the Design of Experiments. In the 1980s it was repackaged as one of many Six Sigma tools. In 2003, it is simply Profit Signals that come from the New Management Equation. Once data are entered into a data matrix, a computer automatically calculates the profit signals, determines the strength of evidence and graphs the predicted values. It even explains the results.

The New Management Equation

Use your own imagination to conduct experiments to verify the insights you gained from your Five-Minute PhD. To quote one of our past students, “An analogy is like a comparison.” An analogy illustrates similarities between ideas or things that are often thought to be dissimilar. We encourage you to use analogies to accelerate your learning. We have found that analogies, parables and old-fashioned story telling are the most effective tools for teaching people the principles of evidence-based decisions.

Table 6 lays out two proven favorites.

Table 6 Like any true Generalization, the cube experiment is a Law of the Universe. This is yet another paradox in evidence-based decisions.

After you have completed your experiments with the cube, discuss the implications of these analogies with family, friends, and colleagues at work. Specifically, how can you use what you now know about a data matrix and vector analysis to save time and/or make more money? Since you are now a PhD, feel free to throw around phrases like “Hegelian Dialectic” during your conversations. This crowd-pleaser will let other doctors of philosophy know that you know what you are talking about. Have some fun practicing with your new PhD in universes of your own.

Closing Arguments

The following testimonies were transcribed in various historical hearings and trials about the Five-Minute PhD.

Archimedes: “Eureka, I have found it. I have found it. I solved the mystery of specific gravity by comparing the weights of solids with the weights of equal quantities of water.”

A king, addressing his teacher: “Euclid, you made the impossible possible by the simplest of methods. But please, isn’t there a shorter way of learning geometry than through your method?”17

Euclid: “Sire, in the country there are two kinds of roads—the hard road for the common people and the easy road for the royal family. But in geometry all must go the same way. There is no royal road for learning. Now if I were to speculate a bit: if there were a computing machine 2500 years from now that ran a vector analysis program with a data matrix, all might be able to travel an easier road.”

Galileo Galilei: “Philosophy is written in this grand book the universe, which stands continually open to our gaze. But this book cannot be understood unless one first learns to comprehend the language and read the letters in which it is composed. It is written in the language of mathematics, and its characters are triangles, circles, and other geometric figures without which it is humanly impossible to understand a single word of it. By using Euclid’s magical formula, c² = a² + b², one can transform the earth and the heavens into a vast design of intricate configurations.”

Albert Einstein: “The Gaussian coordinate system of Chance variation is a logical generalization of the X, Y, Z Cartesian coordinate system. I used the data matrix cube with a vector to suggest the passage of time in my 1916 best seller, Relativity, The Special and General Theory, A Simple Explanation that Anyone Can Understand.”18

James Turrell is a hyperspace sculptor of international reputation. Mr. Turrell uses light in his search for mankind’s place in the Universe.

James Turrell: “I want to create an atmosphere that can be consciously plumbed with seeing, like the wordless thought that comes from looking in a fire. I use the X, Y, and Z axes of light to achieve my objectives. I use the same Cartesian coordinate system to pilot my aircraft.”19

Walt Disney: “I only hope that we never lose sight of one thing—that it all started with a mouse. Born of necessity, the little fellow literally freed us of immediate worry. He provided the means for expanding our organization. He spelled production liberation for us. Disneyland will never be completed. It will continue to grow as long as there is imagination left in the world.”20

Endnotes

1 These factors are also commonly known as independent variables.

2 Box, Joan Fisher. R. A. Fisher, The Life of a Scientist. New York: John Wiley and Sons, 1978.

3 Netter, Frank. The CIBA Collection of Medical Illustration, Volume 5, Heart. Commissioned by CIBA, 1969.

4 Netter, Frank. The CIBA Collection of Medical Illustration, Volume 5, Heart. Commissioned by CIBA, 1969. Page 4.

5 Dubin, Dale. Rapid Interpretation of EKG’s, Edition V. Tampa: Cover Inc., 1996.

6 http://www.rothamsted.bbsrc.ac.uk/pie/sadie/reprints/perry_97b_greenwich.pdf

7 http://www-gap.dcs.st-and.ac.uk/~history/Mathematicians/Descartes.html

8 http://www.ibiblio.org/wm/paint/tl/20th/cubism.html

9 http://www.artchive.com/artchive/P/picasso_analyticalcubism.html

10 Inscription on the southern ceiling of the rotunda leading to a James Turrell Skyspace installation at the Henry Art Gallery on the University of Washington campus.

11 Einstein, Albert. Relativity, The Special and General Theory, A Clear Explanation that Anyone can Understand. New York: Crown Publishers, 1952.

12 Fisher, R. A. “Frequency Distribution of the Values of the Correlation Coefficient in Samples from an Indefinitely Large Population.” Biometrika, 10: 507-521, 1915.

13 Fisher, R. A. “On the Probable Error of a Coefficient of Correlation Deduced from a Small Sample.” Metron, 1: 3-32, 1921.

14 Fisher, Ronald A. The Design of Experiments. New York: Hafner Press, 1935. Page 32.

15 Fisher, Ronald A. Statistical Methods for Research Workers, Thirteenth Edition. New York: Hafner Publishing Company Inc., 1941.

16 Box, George E. P., Hunter, William G., and Hunter, J. Stuart. Statistics for Experimenters: An Introduction to Design, Data Analysis and Model Building. New York: John Wiley and Sons, 1978.

17 Thomas, Henry, and Thomas, Dana Lee. Living Biographies of Great Scientists. Garden City, New York: Garden City Books, 1967. Page 90.

18 Einstein, Albert. Relativity, The Special and General Theory, A Simple Explanation that Anyone Can Understand. New York: Crown Publishers, 1952. Pages 4-5.

19 http://www.pbs.org/art21/artists/turrell/

20 http://goflorida.about.com/library/bls/bl_wdw_waltdisney_quotes.htm

Chapter 2

Standards of Evidence

What are the objective standards of evidence your business uses to make decisions? We ask all new clients this question. Too often the answer is an uncomfortable silence or, “We’ve never asked ourselves that question before.”

Evidence is the foundation for making better, more profitable business decisions. But evidence provides an operational basis for making decisions only if we have standards by which to judge the strength of evidence.

Demands for improved financial performance put old-school managers in a bind. For the first time in history, they are competing head-to-head with managers in developing nations like Brazil, China, India, Mexico, and Malaysia—take your pick. Labor is a tiny fraction of the total cost of doing business in these newly emerging competitive economies. As a result, North American businesses, large and small, must find ways to reduce production and delivery costs by at least 30 percent. Many must achieve this within the next five years or go out of business. If you doubt this possibility, visit Seattle, Washington. Since September 11, 2001 a good portion of our world-famous gridlock traffic jams vanished along with more than 30,000 jobs.

For many old-school managers, staying inside their corporate cultural comfort zone, with an atmospheric vacuum of evidence standards, is more important than achieving any business goal, including profitability. This is a counter-

productive and, given the painfully apparent need for jobs, a
socially irresponsible attitude.

Comfortable or not, spirited capitalism has put evidence-
based decisions on the map. Whether they know it or
not, vector analysis and standards of evidence are now on
every manager’s radar screen. The only question is who will
recognize and respond to the signals.

Poetry versus Science

Efforts to understand the world we live in began with story
telling. Stories thrive in many forms, oral and written, poetry
and prose. Stories convey values. They define and maintain
cultures, including corporate cultures. Stories evoke fear,
hope, joy, anger, sympathy, humility, respect, wonder and
awe. Stories build like pearls around grains of historical fact.
They tell us much, mostly about ourselves.

Stories are not laws. They do not, and are not intended to,
reliably describe historical facts or physical realities. Story
telling does have its place, but it can be at odds with science.
Story telling often involves tales of trial and error.

Scientific discoveries inspire as much wonder and awe as
any Paul Bunyan tale. But, the driving force behind science
is disciplined observation, experimentation and analysis.
The scientific method, which can be equated with Six
Sigma, embraces affirmative skepticism. This skepticism is
diametrically opposed to the innocence of credulity. Credulity,
or as some prefer to say naïveté, is the suspension of critical
thinking. Credulity allows us to experience the emotional
impact of a good story. Credulity makes Disneyland,
Disneyworld and the Epcot Center fun.

The tension between story telling and science dates to poet
John Keats’ criticism of scientist Isaac Newton’s prism.
Newton discovered that “white light” contains an invisible
spectrum of colored light. He made this spectrum visible by
shining ordinary light through a prism. The process Newton
used is called refraction. Refraction comes from a Latin word
which means to break up.1 If you have ever had an eye exam
for corrective lenses, your ophthalmologist or optometrist
used refraction to determine your prescription.

Newton used his prism to create rainbows (Figure 1). Keats
was appalled. Newton ruined rainbows by explaining them.2

Glasses, contact lenses, fiber-optic cables, lasers, big-screen
TV and digital cameras work because Newton stuck to his
intellectual guns. We are glad he did.

The process of refracting white light into a visible spectrum
of colors is a form of vector analysis. We are not being overly
lyrical when we say that Ronald Fisher’s vector analysis
“refracts” data. Refraction makes profit signals visible. This is
essentially what you did to earn your Five-Minute PhD.

Figure 1 A diamond sparkles with
colorful data vectors refracted from
ordinary light.

For poets, this perspective is unwelcome. They are not alone
in this feeling. Again and again, we hear Keats’ critique
of Newton echoed in the protests of old-school managers
who reject profit signals as well as the process of disciplined
observation, experimentation and analysis.

“Scientific” Management

Managing any business is a challenge. Complexity arises
from materials, work methods, machinery, products,
communication systems, customer requirements, social
interactions, cultures and languages. The first step in solving

complex business problems is to frame them in terms of a
manageable number of key variables.

Bottom-line profitability is the ultimate objective, but other
metrics must also be considered. Sales, earnings per share,
cost and time to develop and market new products, operating
costs, inventory turnover, capital investments and days in
accounts receivable are just a few. Profit signals from one
or more of these variables often demand timely, reasoned
responses.

Frederick W. Taylor mesmerized the business community
of his day with the 1911 publication of The Principles
of Scientific Management. Taylor aimed to explain how
any business problem could be solved “scientifically.” As
an engineer for a steel company, Taylor had conducted a
26-year sequence of “experiments” to determine the best
way of performing each operation. He studied 12 factors,
encompassing materials, tools and work sequence. He
summarized this massive investigation with a series of multi-
factor predictive equations.

This certainly sounds like science. Unfortunately, trying to
solve complex business problems with Taylor’s methods is
akin to surfing the Internet with a rotary phone. In his 1954
classic How to Lie with Statistics, Darrell Huff characterized
Taylor-style science as follows: “If you can’t prove what you
want to prove, demonstrate something else and pretend that
they are the same thing.”3

Taylor studied his 12 factors one at a time, holding the
other 11 constant in each case.4 This invalidates his multi-
factor equations. One-factor-at-a-time experiments are so
thoroughly discredited, that they have their own acronym,
OFAT. It is physically impossible for OFAT experiments to
characterize multi-factor processes. OFAT experiments are
also notoriously time consuming. This is probably why it took
Taylor 26 years to complete his study.


Cost Accounting Variance Analysis

Businessmen in the early twentieth century enjoyed
comparing themselves to Einstein, Marconi, Edison, the
Wright Brothers and other celebrity scientists of the day. G.
Charter Harrison, an accountant with Price, Waterhouse
and Company in London, chose Taylor as the celebrity
“scientist” he wanted to emulate. Harrison published a
series of articles in 1918 in support of his assertion that,
“The present generally accepted methods of cost accounting
are in as retarded a state of development as were those of
manufacturing previous to the introduction by Frederick W.
Taylor of the idea of scientific management.”

A tidal wave of popularity was carrying Taylor’s book to
best seller status. Harrison rode this wave. He advanced
“scientific” principles for cost accounting. He proposed that
“standard costs” be established for various tasks, and that
actual costs be analyzed as deviations from the standard
costs. This was an advance over previous methods. Harrison
went on to describe an assortment of things that could be
calculated from such differences, including “productivity
ratios.”

A 1964 Times Review of Industry article first used the term
variance to describe Harrison’s difference between actual
and standard costs.5 Perhaps old-school accountants and
managers thought “variance” sounded more scientific than
“difference.” They had good reason to do so. By 1964 Ronald
Fisher’s vector analysis solutions to a wide variety of statistical
problems were widely known under his general term for
them, Analysis of Variance.

Analysis of Variance is the international gold standard for
quantitative work in virtually every profession. Prior to the
invention of Six Sigma in 1986, two notable professions were
the only exceptions to this rule: accounting and business
management.

By 1978, business journalists were using the phrase “variance
analysis” to refer to the examination of differences between
planned and actual performance. The expression persists in
today’s accounting textbooks: “The act of computing and
interpreting variances is called variance analysis.”6


Needless to say, the cost accounting variance analysis of 1978
bore no relation to the Analysis of Variance invented by
Fisher some 58 years earlier. The elements of a standard cost
accounting variance analysis are shown in Table 1.
Table 1 Cost-accounting variance report formats vary. The key element is a column labeled “Variance Ratio.” It is the signed difference between an actual value and a standard, budgeted or forecast value, expressed as a percentage.7, 8, 9

It is unfortunate that the word “variance” was redefined
in 1964 to mean a difference between actual and standard
values. There is nothing inherently wrong with analyzing such
differences. In fact, it is a good idea. The problem comes in
the type of “analysis” that is done with such differences, and
the actions “variance analysis” conclusions can lead to.
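Here is the “Variance Ratio” arithmetic of Table 1 in a minimal sketch. The budget and actual figures are invented, but they match the example that follows:

    # Hypothetical line item: $5,000 revenue forecast, $4,000 actual.
    budget = 5000.0
    actual = 4000.0

    variance = actual - budget                  # signed difference: -1000.0
    variance_ratio = 100.0 * variance / budget  # as a percentage: -20.0

    print(variance, variance_ratio)  # $1,000 under budget, 20% under forecast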

For example, the manager who is responsible for the $1,000
revenue “variance” in Table 1 will be asked to explain
himself or herself. After all, the result is 20% under forecast!
The explaining of this unacceptable negative variance occurs
at the monthly meeting of the Executive Committee.

This monthly ritual creates tremendous pressure to conform.
It subverts critical thinking. Managers are forced to develop
story-telling skills. A plausible explanation is produced. The
manager vows not to let this bad thing happen again. After
a month or two or three, the offending “variance” happens
again. A plausible explanation is produced. The manager
swears never to let this new bad thing happen again. And so
on.

The highest-paid employees in the company waste hours,
days and even weeks every month, grilling each other over
G. Charter Harrison’s 1918 productivity ratios. Objective

standards of evidence are nowhere to be found in their discussions. The monthly cross-examination over variance analysis rather than Analysis of Variance (ANOVA) is an indefensible waste of time and money. It is every company’s greatest obstacle to evidence-based decisions and Six Sigma breakthroughs. The Executive Committee may as well try to produce rainbows without a prism.

Accounting versus Science

Today’s Generally Accepted Accounting Principles (GAAP) are loose guidelines. Little has changed from those submitted to the United States government by a committee of Certified Public Accountants in 1932. The authors of the book Relevance Lost: The Rise and Fall of Management Accounting, winner of the 1989 American Accounting Association’s award for Notable Contribution to Management Accounting Literature, judged the accounting profession to be functioning 70 years behind the times. In 2003, that makes it 84 years behind the times! How could this happen?

Perhaps it is a function of what the customers want. The assistant dean of a leading business graduate school recently told us that her university continued to teach a core curriculum subject—cost-accounting variance analysis—that she personally knew to be false. She reasoned, “Businesses in this region hire graduates who know how to use cost-accounting variance analysis.”

Another, more revealing explanation runs deeper. “Accounting principles are man-made,”10 in the words of a Harvard Graduate School of Business Administration textbook. Unlike principles of physics, chemistry and the other natural sciences, accounting principles were not deduced from basic axioms, nor can they be verified by observation and experimentation. Empirical laws of science, by contrast, are forced to evolve through an inexorable process of disciplined observation, experimentation and analysis that leads to improvements. New laws force old ones to be revised or scrapped. Over time, a body of evidence becomes so compelling it becomes a new Generalization or Law.

In other words, cost accounting principles cannot be tested for validity. There is no process of disciplined observation, experimentation and analysis to force improvements. They simply lack the firm foundation and objective standards of evidence that sound theory provides. Until one-dimensional cost-accounting arithmetic is upgraded to vector analysis applied to a data matrix, GAAP will remain wide enough to drive a Six Sigma tractor-trailer rig through.

Delusions and Bamboozles

In his 1841 classic Memoirs of Extraordinarily Popular Delusions,12 Charles Mackay described massive losses related to business practices just like today’s cost-accounting variance analysis. He could well have been writing about 20th and 21st Century popular culture when he penned the chapter “The Love of the Marvelous and Disbelief of the True”:

“In reading the history of nations, we find that, like individuals, they have their whims and their peculiarities, their seasons of excitement and recklessness, when they care not what they do. We find that whole communities suddenly fix their minds upon one object, and go mad in its pursuit; that millions of people become simultaneously impressed with one delusion, and run after it, till their attention is caught by some new folly more captivating than the first.”13

This passage is eerily familiar to those of us who have watched businesses become captivated by one management fad after another: “Excellence”, “Zero Defects”, “Total Quality Management”, “Re-Engineering”, “Activity-based Cost Accounting”, “Zero-Based Budgeting”, “Management by Objective” and “Balanced Scorecards” are a few of the greatest hits. None of these well-intentioned initiatives were, or are, particularly bad in and of themselves.

Occasionally, a superstition, fad or fallacy—astrology, homeopathy, phrenology, cost-accounting variance analysis, take your pick—manages to survive for a few years or decades. Eventually, a kind of critical mass of delusion is established.14 Carl Sagan explained it this way: “One of the saddest lessons in history is this: If we have been bamboozled long enough, we tend to reject any evidence of the bamboozle. We’re no longer interested in finding out the truth. The bamboozle has captured us. It’s simply too painful to acknowledge, even to ourselves, that we have been taken. Once you give a charlatan power over you, you almost never get it back.”15

So the old bamboozles tend to persist as new ones rise. Once you have bamboozled the public, the stockholders, or the employees of your company, the path of least resistance is to keep on bamboozling. The capacity for critical thinking erodes. All business leaders—plant managers, doctors, billionaire CEOs—face this dilemma when they try to bring evidence-based decisions into their organizations. Generally Accepted Accounting Principles place no premium on truth or even facts. They prize only internal consistency.

Our proposal? Replace cost-accounting variance analysis with a reliable, transparent, proven method of analysis based on objective, quantitative standards of evidence. That proven method is profit signal vector analysis applied to a data matrix.

Vector Analysis 101

John Keats, a poet of the Romantic Period, lived in a world of pure expression. If he were an accountant today, he would demand a certain freedom of expression. He would present his data in free verse or any other format he desired. He would analyze data however he wanted. He would assign to his analysis whatever weight of evidence felt right, according to his prevailing emotions. Amazingly, he would be granted all these freedoms today without so much as a raised eyebrow. According to a 1990s Professor Emeritus at the Harvard Graduate School of

Business Administration, “There are no prescribed criteria [for variance analysis] beyond the general rule that any technique should provide information worth more than the costs involved in developing it.”16

The lack of any prescribed criteria for financial analysis explains why spreadsheets are so popular. A spreadsheet consists of rows and columns (see Table 2). The cells can contain text, numbers, symbols or formulas. There are no laws for arranging or analyzing data. There are no rules governing the interpretation of rows and columns. Just like Keats, most people like being free to arrange their data in any way that suits their fancy. What better way to spice up the workday?

Table 2 A spreadsheet consists of rows and columns. The cells can contain text, numbers, symbols or formulas. There are no laws for arranging or analyzing data. There are no rules governing the interpretation of rows and columns.

The situation is quite different for a Law or a Generalization like Fisher’s Analysis of Variance. Fisher developed the Analysis of Variance on a farm near London right around the same time Taylor and Harrison were promoting their “scientific” management and cost-accounting principles. Unlike Taylor and Harrison, Fisher actually was a scientist. His work was grounded in rigorous mathematics and physical reality: Cartesian coordinates, right triangles, Pythagoras’ Theorem, plane and solid geometry and trigonometry, multi-variable calculus and vector analysis. Managers would do well to follow his lead. Evidence-based decision companies repeatedly demonstrate why this is a profitable choice. Ironically, with statistical software they can do so immediately at virtually no cost.

Fisher coined the term Variance in 1918. This was 46 years before its debut in the Times Review of Industry.17 It is an understatement to say the two definitions are significantly different. They are as different as Generalization and

generalization. To quote Mark Twain, “It is the difference between lightning and a lightning bug.”

There the similarity ends. Instead of a difference between an actual value and a standard value, Fisher’s Variance measures the degree of variability of a set of values around their average. It is based on the length of the variation vector. Fisher called his method “Analysis of Variance” because its purpose is to break up the variation vector into profit signal and noise components. Fisher’s work defines today’s international standard for analyzing components of variation.

In 1919, at age 29, Fisher was hired to “examine data and elicit further information” from his employer’s database. According to his employer, “It took me a very short time to realize that he was more than a man of great ability, he was in fact a genius who must be retained.”18 Fisher’s job was to re-evaluate a business report identical to the ones managers use for decisions today.19 There were measurements recorded in rows and columns. Fisher’s boss subtracted average annual production numbers from each other to “determine” which years were most productive. The boss wanted to increase the annual yield in bushels of wheat. Like most people, he wanted to make more money while working shorter hours and using fewer resources. Fisher knew exactly how to help his boss achieve these objectives: apply a vector analysis to a data matrix.

Like a spreadsheet, a data matrix consists of rows and columns. The rows of a data matrix represent records—the individual objects or events on which we have data. The columns represent fields—the variables measured or inspected for each object. Each column in a data matrix contains the measurements which are the data vector for the variable associated with the column. Each object is represented by a particular coordinate or position in the vector. The number of rows is called the sample size. Table 3 shows a simple data matrix. There are two variables measured on two objects.

Table 3 Like a spreadsheet, a data matrix consists of rows and columns. The rows of a data matrix represent records—the individual objects or events we have data on. The columns represent fields—the variables for which we have data.

              Variable 1    Variable 2
    Object 1       3             5
    Object 2       4             2

Each stack of numbers in a data matrix column is a vector. For example, the data vectors in Table 3 are (3, 4) for Variable 1 and (5, 2) for Variable 2. These vectors are plotted in Figure 2. The two coordinate axes correspond to the two objects. In this example the vectors are two-dimensional because there are two objects in the data matrix. In general, vectors are n-dimensional, where n is the sample size.

We are back into hyperspace. Like the inside of a black hole, hyperspace will remain forever beyond our three-dimensional vision. Nevertheless, it is there. It is real. Fisher’s innovation was to think of the data matrix in a geometric framework. Evidence-based decision companies use hyperspace to make more money in less time while using fewer resources.

Figure 2 The two columns of Table 3, plotted as vectors.
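In code, a data matrix is simply a table with one row per record and one column per field. A minimal sketch of Table 3 (any statistical package stores data the same way):

    # Columns are fields (variables); each column is a vector whose
    # coordinates correspond to the records (objects).
    data_matrix = {
        "Variable 1": [3, 4],  # coordinates for Object 1 and Object 2
        "Variable 2": [5, 2],
    }

    sample_size = len(data_matrix["Variable 1"])  # number of rows: n = 2
    print(sample_size, data_matrix)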

Figure 3 illustrates the first basic rule of vector analysis: The shortest distance between a point and a line is along a path perpendicular to the line. This is no arbitrary accounting rule; it is a property of the physical universe. It is a law, a mathematical/scientific Generalization.

Figure 3 The shortest distance between a point and a line is along a path perpendicular to the line.

The first step in a vector analysis is to find the constant vector closest to the data vector. Examples of two-dimensional constant vectors are (1, 1), (2, 2) and (0.5, 0.5). The dotted line in Figure 4 locates the set of all possible two-dimensional constant vectors, moving from the lower left point of origin to the upper right. Figure 4’s center vector masks a long segment of this dotted line. Only a portion of the dotted line is visible at the upper right hand portion of the illustration.

The closest constant vector is always the vector of averages. For our data vectors, the closest point on this line is (3.5, 3.5). It is not a coincidence that 3.5 is the average of 3 and 4. It is not a coincidence that 3.5 is the average of 5 and 2. It does seem coincidental that (3, 4) and (5, 2) have the same average, but we did this on purpose.

So, how do these vectors differ? Well, (3, 4) is closer to the vector of averages than (5, 2) is (Figure 4). A data vector close to its vector of averages has less variability than a data vector far from its vector of averages.

This means Variable 1 has less variability than Variable 2. You can also tell this just by looking at the numbers in Table 3. This “eyeball” analysis is just for illustration; it is not recommended for your real data sets.

Figure 4 The dotted line is the set of all constant vectors. The constant vector closest to any data vector is the vector of averages.

Figure 5 identifies the variation vectors, V1 and V2, for Variables 1 and 2. The length of the variation vector is directly related to the degree of variability in the data vector.

Figure 5 A is the vector of averages for both Variables 1 and 2. V1 and V2, shown here in bold, are the corresponding variation vectors.

How do we calculate the length of a vector? For this we need the second basic rule of vector analysis.

That rule is the New Management Equation (a.k.a. the Pythagorean Theorem): the square of the length of the long side of a right triangle is equal to the sum of the squares of the lengths of the other two sides (c² = a² + b²). Once again, this is no arbitrary accounting rule; it is a property of the physical universe. Only the alphabetic notation differs from the New Management Equation. The New Management Equation is so well known in professional financial and investment analysis circles that a bi-monthly newspaper, Financial Engineering News, was founded in 1997 to disseminate case studies.

In Figure 6 we use the New Management Equation to calculate the lengths of data vectors D1 and D2.

Figure 6 The length of a vector is the square root of the sum of the squares of its coordinates.

Now we can figure out the lengths of the variation vectors in Figure 5. Using the letters in Figure 5, the New Management Equation for Variable 1 is:

(D1)² = A² + (V1)²

We can see in Figure 6 that (D1)² = 25. Also, the squared length of the data average vector A is:

A² = 3.5² + 3.5² = 24.5

We can now plug these two numbers, 25 and 24.5, into the New Management Equation for data vector D1:

25 = 24.5 + (V1)²

25 − 24.5 = 0.5 = (V1)²

V1 = square root of 0.5 = 0.71

This final number, 0.71, the length of the variation vector for Variable 1, is called the sample standard deviation for Variable 1. A sample standard deviation is symbolized in technical writing by the letter s.

The New Management Equation for Variable 2 works the same way:

(D2)² = A² + (V2)²

We know from Figure 6 that D2 = 5.39. We already know that A is the square root of 24.5, which equals 4.95. We can now plug these into the New Management Equation:

5.39² = 4.95² + (V2)²

29.05 = 24.5 + (V2)²

29.05 − 24.5 = 4.55 = (V2)²

V2 = square root of 4.55 = 2.13

The Greek letter sigma (σ) refers to the standard deviation of a population. In Six Sigma practice, the sample standard deviation, s, is often casually referred to as “sigma” or σ. This substitution is a grievous breach of statistical theory, but everyone who uses statistics does it. This is where Six Sigma gets its name.
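Here is a sketch that replays the whole calculation for both variables: the squared lengths, the New Management Equation, and the sample standard deviations. (Exact arithmetic gives V2 = 2.12; the 2.13 above reflects rounding of the intermediate values 5.39 and 4.95.)

    import math

    average = [3.5, 3.5]  # A, the vector of averages for both variables

    def decompose(data):
        """Split a data vector into its average and variation components."""
        variation = [d - a for d, a in zip(data, average)]
        D2 = sum(d * d for d in data)        # squared length of the data vector
        A2 = sum(a * a for a in average)     # squared length of A: 24.5
        V2 = sum(v * v for v in variation)   # squared length of the variation vector
        s = math.sqrt(V2 / (len(data) - 1))  # sample standard deviation
        return variation, D2, A2, V2, s

    for name, data in (("Variable 1", [3, 4]), ("Variable 2", [5, 2])):
        variation, D2, A2, V2, s = decompose(data)
        assert abs(D2 - (A2 + V2)) < 1e-9  # the New Management Equation holds
        assert abs(sum(variation)) < 1e-9  # the coordinates add up to zero
        print(name, variation, D2, A2, V2, round(s, 2))
    # Variable 1: [-0.5, 0.5], 25 = 24.5 + 0.5, s = 0.71
    # Variable 2: [1.5, -1.5], 29 = 24.5 + 4.5, s = 2.12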

The sample standard deviation for Variable 2 is 3 times larger than that for Variable 1! Variable 2 is 3 times more variable than Variable 1. Six Sigma values smaller variation because outcomes are more predictable. There is less waste and rework. Predictions are more accurate. Everything just works better when the profit signals are large/strong and the noise is small/weak.

Degrees of Freedom

Don’t panic. This is just background information. It has to be here to ensure we are not breaking any Laws of the Universe. You can skip this section if you want, or you can stay tuned. Think of what follows as a mandatory Federal Communications Commission announcement on your National Public Radio station. In either case, software takes care of all this stuff. (Whenever our airplane takes off or lands, we certainly hope our pilot and co-pilot have this information at their fingertips.)

Sometimes an analyst might want or need to know the actual coordinates of a variation vector. We get the coordinates of the variation vector by subtracting the data average vector from the data vector. The clearest way to explain the subtraction of vectors is to give the vectors a vertical orientation, the way they appear in a data matrix. The coordinates of the variation vector for Variable 1 are given by:

    [3]   [3.5]   [-0.5]
    [4] - [3.5] = [ 0.5]

For Variable 2, they are given by:

    [5]   [3.5]   [ 1.5]
    [2] - [3.5] = [-1.5]

So far, so good. Now, here is an important Law of the Universe: The coordinates of a variation vector always add up to zero. Because of this, the second coordinate in a two-dimensional variation vector is always equal to the negative of the first coordinate. This means that a two-dimensional variation vector is completely determined by its first coordinate. We express this by saying that a two-dimensional variation vector has one degree of freedom.

Suppose now we have a three-dimensional data vector, for example (3, 4, 5) or (5, 2, 5). The vector of averages for both of these is (4, 4, 4). Once again using the vertical data-matrix orientation, the first variation vector is:

    [3]   [4]   [-1]
    [4] - [4] = [ 0]
    [5]   [4]   [ 1]

and the second is:

    [5]   [4]   [ 1]
    [2] - [4] = [-2]
    [5]   [4]   [ 1]

In a three-dimensional variation vector, the third coordinate is always equal to minus the sum of the first two coordinates. This means that a three-dimensional variation vector is completely determined by its first two coordinates. We express this by saying that a three-dimensional variation vector has two degrees of freedom.

Now let n stand for the number of objects in your data set. This is the same as the number of rows in your data matrix. It is your sample size. All the vectors are now n-dimensional. Yes, we are back into genuine hyperspace again. The more often you go there, the less scary it becomes. Visits become more profitable. They become fun.
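A quick check of the three-dimensional examples: the coordinates sum to zero, and the last one is always determined by the others.

    for data in ([3, 4, 5], [5, 2, 5]):
        avg = sum(data) / len(data)       # 4.0 in both cases
        variation = [d - avg for d in data]
        print(variation, sum(variation))  # [-1, 0, 1] and [1, -2, 1]; sums are zero
        # only the first n - 1 coordinates are free:
        assert variation[-1] == -sum(variation[:-1])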

The pattern continues in higher dimensions. The last coordinate of an n-dimensional variation vector is always equal to minus the sum of the first n − 1 coordinates. Don’t blame us—it’s a Law of the Universe. This means that an n-dimensional variation vector is completely determined by its first n − 1 coordinates. We express this by saying that an n-dimensional variation vector has n − 1 degrees of freedom.

The upshot of all this is this: the standard deviation is exactly equal to the length of the variation vector only when n = 2. When n is greater than 2, as it usually is, we have to divide the length of the variation vector by the square root of its degrees of freedom. We will come back to this later in the chapter, and also in Chapters 5 and 6. We now return to our regularly scheduled program of writing with an improved degree of simplicity.

Bar Chart Bamboozles

Bar charts and pie charts symbolize old-school management thinking as no other icon can. They are easy to use. They present data in superficial ways, and they frequently are used to misrepresent data. They are the “Gee Whiz” graphs in Huff’s How to Lie with Statistics.

The typical bar chart presents totals or averages with no consideration of variability, which is statistical variation. This violates a Law of the Universe. Variation is a physical property of objects and measurements. There is always Noise. A bar chart pretends there is no Chance variation, that there are no deviations from the average, that there is no Noise. A vector analysis, by contrast, forces us to consider both average and standard deviation. At best, we might say bar charts have a 50/50 chance of giving correct information because they consider only one of two aspects. At worst, they encourage managers to use Frederick Taylor’s thinking: “This bar is bigger than that bar is and I know the reason why because I am a scientific manager and I say so.”

As an example, consider the monthly revenue data in Table 4. This is a snapshot of data entered into a spreadsheet.

Table 4 Monthly revenue for four years.

The annual totals are plotted as a bar chart in Figure 7. The upward trend looks very encouraging. The Marketing Manager would certainly want to take credit for this. But because Laws of the Universe are ignored, there is no way to tell whether the “trend” is a profit signal or noise. Ignoring them is a bit like trying to ignore gravity. All Figure 7 really does is graphically frame the differences between the annual totals.

Figure 7 Excel’s popular bar chart/trend line combination is like Romantic poetry. This poetic license gives everyone the freedom to take credit for good results, whether or not they are true.

Corporate cultures that use cost-accounting variance analysis as the standard decision-making tool often use bar charts and trend lines to present “results” like Figure 7 based on data like that in Table 4. The credibility of the results portrayed by the chart, and the explanation for them, comes from the status of the person telling the story rather than the evidence in the data. There is no cross-examination of the reported results because it is considered poor form, not to mention career limiting, to question the President, Chief Financial Officer, Managing Director, or a company founder who created spreadsheet software.

In corporate cultures that base decisions on objective standards of evidence, the analysis method itself is held to high standards. Quite simply, it must follow the Laws of the Universe. It must follow the rules of vector analysis. Evidence is admissible if and only if the analysis method takes all aspects of the data into account. The analysis must have transparency. All elements must be available for review, including the raw data. Anyone can ask any question because all the data are in view.

There are several things wrong with the “analysis” in Figure 7. For one thing, it uses only the annual totals instead of the original monthly data. For purposes of illustration, we will present two vector analyses that use only the four annual totals. The vector analyses illustrated below represent the international standard. The first of these is given in Table 5.

Table 5 lays out the basic vector calculations for the sample standard deviation, s. This is a vector analysis in four-dimensional hyperspace, because there are four data points. The data average vector has one degree of freedom because one number, the average of the four data points, determines it. This leaves three degrees of freedom for the variation vector. The lengths of the vectors are related by the New Management Equation: 140.36 = 140.30 + 0.06. [C² does equal A² plus B².] In this case s = 0.14.

We must conclude that the deviations from the mean © M. All Rights Reserved. ���� ���� ����� ���� � ���� � ���� ���� ���� ����� ���� ���� ���� ������������������ � � ���� � ���� ��������������� ������ � ������ � ���� ��������� ����������������� � ������������������� �� ���� ������������������� ���������������������������� ���� Figure 8 shows the Normal distribution curve corresponding to a mean of $5. The dots just above the horizontal axis represent the four annual totals.14 million. The syntax for the Excel calculation is: = SUMSQ(cell range) �������� ������������ ��������� Table 5 Basic vector analysis of the ������ ������ ������ four annual totals (millions of dollars). The squared lengths of the vectors were calculated by using the cell function SUMSQ. This is appropriate because the squared length of a vector is the sum of the squares of the coordinates. This function name is short for “sum of squares”. Boyles. Daniel Sloan and Russell A.50 5.70 Standards of Evidence We used Microsoft Excel to create the visual presentation in Table 5.14 Normal distribution curve.92 6.92 million and a sample standard deviation of $0.34 All four data points lie within two standard deviations of the mean. -3s -2s -1s 0 +1s +2s +3s 5. Figure 8 The four annual totals from Table 4 and the corresponding 0. 2003 . Each vertical dotted line represents one standard deviation.

2003 . Larger values of F imply stronger evidence against the null hypothesis. The squared lengths of the vectors are also called “sums of squares. The idea is to see whether or not the evidence in the data is strong enough to discredit the null hypothesis. The null hypothesis for this analysis is the following statement: There is no significant trend in the annual totals. Daniel Sloan and Russell A. or F statistic. variation. As a result. It is called the F ratio. This is a Law of Universe. This is not a foregone conclusion.923 in this case. Boyles. Standards of Evidence 71 value are a result of natural. When we divide the profit signal variance by the noise variance we get a signal-to-noise ratio that measures the strength of evidence against the null hypothesis. Pythagorean Theorem). It is used in applied research all over the world. That leaves two degrees of freedom for the noise vector. Without this adjustment. or Chance. In this case F = 2. the variances would be biased. 5.843. can we say there is a significant trend in the annual totals. we divide the sums of squares by the degrees of freedom. All Rights Reserved. the profit signal vector has one degree of freedom. The coordinates of the profit signal always add up to zero. Then. Our second vector analysis addresses directly the validity of the bar graph trend line in Figure 7. because Ronald Fisher invented it.” The profit signal vector is equal to the best-fit line in Figure 7 minus the data average. and only then. so it is completely determined by the slope of the best-fit line.a.k. To get the profit signal and noise variances. These three vectors are related by the New Management Equation (a. © M. The variation vector is broken up into the sum of profit signal and noise vectors. It is a special kind of hypothesis. There is certainly no evidence of significant differences among these totals. The visual presentation of the analysis is shown in Table 6.

Table 6 Illustration of the vector analysis for a linear trend in the four annual totals. The squared lengths of the vectors are also called “sums of squares,” a reference to the New Management Equation, which involves a sum of squared numbers.

An F ratio of 2.843 doesn’t seem very large. But there is no standard scale of comparison for the F ratio. Instead, we interpret it relative to a statistical distribution representing chance variation. This distribution depends on the degrees of freedom for the profit signal and noise vectors. The p-value of 0.234 in Table 6 is the probability of getting an F ratio as large as 2.843 by chance alone. If the p-value is small enough, we reject the null hypothesis. Then, and only then, can we say there is a significant trend in the annual totals.

By established international standards, the evidence against the null hypothesis is “clear and convincing” if the p-value is less than 0.05. If the p-value is greater than 0.05 but less than 0.15, there is a “preponderance of evidence” against the null hypothesis. The p-value in Table 6 does not meet even this lowest standard of evidence. There is no significant trend.

Table 7 shows the monthly revenue numbers in data matrix format. This data set is too large to use as a tutorial; we present some smaller examples in Chapter 5.

Table 7 The monthly revenue numbers in data matrix format (thousands of dollars).

Meanwhile, a great deal can be learned simply by plotting the data in time sequence. This is done in Figure 9. It doesn’t take a Statistician to see that there is no trend here, just random variation. The only features of note are the three low points at the beginning of the series. It turns out these were the last three months before a change in the accounting procedures. They should have been omitted from the analysis.

Figure 9 The monthly revenue numbers plotted in time sequence.

The Game is Afoot

Another example of a full vector analysis is the shoe-sole wear rate workshop in the classic 1978 text Statistics for Experimenters: An Introduction to Design, Data Analysis and Model Building by George Box, William Hunter and J. Stuart Hunter.20 This example uses the small data set presented in their book, with an invented story line based on our consulting experiences. It achieves the following objectives:

1. You can quickly see the differences between a typical spreadsheet analysis and vector analysis applied to a data matrix.

2. The manufacturing design, cost and margin analogies are appropriate.

A design team is arguing over the wear rates of shoe-sole materials A and B. Material A, the current specification, is more costly than Material B. The manager wants to go with Material B because it is cheaper, and his spreadsheet analysis shows there will be no significant loss of durability. Engineers are concerned that Material B is not sufficiently durable.

are concerned that Material B is not sufficiently durable. Data has been collected and arrayed in a spreadsheet. (See Figure 10.)

Figure 10 Wear rate data as arrayed in Excel.

Ten boys were enlisted for the test. Each boy wore one shoe made from Material A and one from Material B. Coin tosses were used to randomly assign Material A to the left or right foot for each boy.

The average wear rate for Material B comes out 0.41 units higher than for Material A, an increase of 3.86%, as shown by the bar chart in Figure 11. Furthermore, there were a number of cases where Material A actually wore out faster than Material B! The manager is elated. Given the price difference between the two materials, the manager concludes that the difference in durability is irrelevant.

Figure 11 Wear rate data as analyzed by a spreadsheet bar graph.

By using

material B instead of A, the shoe manufacturer can increase profit margins and maintain product durability. The company will replace material A with the less costly, equally durable material B. This change will be worth millions to the bottom line. After a long and difficult team meeting, consensus is reached.

As the meeting is wrapping up, a Six Sigma Black Belt in training asks if she can analyze the data herself using a vector analysis applied to a data matrix. It is getting late. People have places to go, things to do. Nevertheless, to maintain good relationships, they give her five minutes.

She imports their Excel spreadsheet into her statistical package. For present purposes, we recreate her vector analysis data in Excel. This is shown in Table 8. (We timed both methods. The Excel reconstruction literally took 10 times longer than doing a correct vector analysis in the statistical package.)

Table 8 Vector analysis of the wear-rate data.

The Black Belt trainee starts her extemporaneous presentation by stating the null hypothesis for the analysis: "There is no difference between the average wear rates of the two materials." The trainee explains that this is a hypothesis, a straw man to be pulled apart by evidence, rather than a foregone conclusion. The idea is to see whether or not the

evidence in the data is strong enough to discredit the null hypothesis.

She goes on to explain that we should be looking at the differences between A and B for each boy—that was the whole point of having each boy wear one shoe of each kind. If the null hypothesis were true, the differences should be symmetrically distributed around zero. Also, the average difference should be close to zero.

With three clicks of her mouse, the trainee produces a frequency histogram of the differences (see Figure 12). Pointing at the graph, she says, "As you can see, all but two of the differences are positive. This casts doubt on the null hypothesis—the wear rates for Material B are consistently higher than those for Material A.

Figure 12 Frequency histogram of differences in wear rate (B minus A).

"But let's not jump to conclusions. We need to complete the vector analysis to establish the strength of this evidence."

For analyzing matched pairs like we have here, the vector analysis (Table 8) breaks the vector of differences into the sum of the data average vector and the noise vector. In this case, the data average and the profit signal vector are one and the same.

"As you can see, the lengths of the vector of differences, the profit signal vector and noise vector are related by The New Management Equation. The vectors are 10-dimensional because there are 10 differences. We are way into hyperspace. The profit signal vector is determined by one number, the average difference of 0.41, so it has one degree of freedom. That leaves nine degrees of freedom for the noise vector."

Her presentation was interrupted by one of her friends. "Let's take a pause for just a moment here to do a little yoga stretching while our minds are bending," she said. After some uncomfortable laughter, the Black Belt's Six Sigma analysis continued.

"OK. We are back on task. We have to adjust the New Management Equation (a.k.a. sums of squares, a.k.a. squared lengths of vectors) by dividing by the degrees of freedom. This gives us Variances that measure the strength of the profit signal and noise vectors.

"When we divide the profit signal variance by the noise variance we get a signal-to-noise ratio that measures the strength of evidence against the null hypothesis. It is called the F ratio because a guy named Fisher a long time ago invented it. As you can see, the F ratio in this case is 11.215.

"The F ratio can't be interpreted on its own. We have to compare it to a distribution to see how likely it is that a value as large as 11.215 could have occurred by chance alone. This probability is called the p-value. If the p-value is small enough, we have to reject the null hypothesis.

"By established international standards, the evidence against the null hypothesis is 'clear and convincing' if the p-value is less than 0.05, and it is 'beyond a reasonable doubt' if the p-value is less than 0.01 (Table 9). As you can see," pointing at her computer screen, "the p-value in this case is 0.0085. This means there is a significant difference between A and B, beyond a reasonable doubt."
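Her arithmetic is easy to check outside a spreadsheet. The short Python sketch below runs the same matched-pairs vector analysis on the ten B-minus-A wear differences from Box, Hunter and Hunter's data set; it reproduces the 0.41 average difference, the F ratio of 11.215 and the p-value of 0.0085 quoted above.

    import numpy as np
    from scipy import stats

    # B minus A wear differences for the ten boys (Box, Hunter and Hunter).
    d = np.array([0.8, 0.6, 0.3, -0.1, 1.1, -0.2, 0.3, 0.5, 0.5, 0.3])

    profit_signal = np.full_like(d, d.mean())   # data average vector (0.41 each)
    noise = d - profit_signal                   # noise vector

    F = (profit_signal @ profit_signal / 1) / (noise @ noise / 9)
    p = stats.f.sf(F, 1, 9)                     # 1 and 9 degrees of freedom
    print(f"F = {F:.3f}, p = {p:.4f}")          # F = 11.215, p = 0.0085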

Table 9 The Black Belt showed the table of evidence to the team.

One engineer says, "That makes a lot of sense. We were afraid yield losses would exceed the savings on material costs. Even though the difference was less than 4%, we felt that a difference of 0.41 units could cause problems."

A potential disaster is narrowly averted by using an evidence-based decision in the nick of time. Critical-to-quality characteristics and financial margins are protected. The company's reputation for quality is preserved. Just another day in the life of a Six Sigma company.

The company takes the next step forward by implementing Six Sigma across all projects and functional responsibilities in the corporate matrix. Their first-wave Black Belts are now in Master Black Belt training using their own case studies. Some of that work has been presented in this chapter. There is more to come in Chapters 5 and 6. The next Black Belt, Green Belt, Yellow Belt and Champion courses are filled to capacity. The waiting lists for the following sessions are long.

Spreadsheet versus Data Matrix

Spreadsheet arithmetic is today's cost-accounting variance analysis computing engine. While teaching the real Analysis of Variance we often hear the comment, "So what's the big deal with a data matrix? You can do all that in a spreadsheet." This is true. It is also true that you could eventually compute the orbital trajectories of all the planets in our solar system with an abacus.21 We know because we have done it. Unless you and your loved ones have nothing better to do with the rest of your lives, our question is this: "Why would anyone want to?"

The spreadsheet is a marvelous invention. It automates arithmetic. (Hence the name.) This works fine for adding and subtracting. You can write formulas. If you add in enough add-ins, you can actually do some statistics, even Analysis of Variance. Adding in the add-ins is a clumsy way of trying to reinvent the machinery of a vector analysis that already exists in modern statistical software.

The greater liability in trying to do everything with a spreadsheet stems from the very freedom that makes spreadsheets so popular. Spreadsheet applications are unruly and Lawless. The Laws of the Universe do not apply to them. You can put your data wherever you want it and analyze it however you want. There is no requirement for vector analysis, no requirement for transparency. The undemanding nature of spreadsheets lures unsuspecting users into sins of omission. We did in fact make Tables 5, 6 and 8 in a spreadsheet. One can create this table in a spreadsheet, although it is tedious, but nothing forces other users to do so.

Statistical packages, on the other hand, like Keats, follow the Law. They require the correct data matrix structure—each row an object of interest, each column a vector of data on the objects. Data vectors are the principal components of vector analysis. Statistical packages automatically create the variation, profit signal and noise vectors shown in Tables 5, 6 and 8. These programs give you access to this machinery with a mouse click. Vector analysis provides the transparency required to satisfy international accounting standards and scientific standards of evidence.

Other spreadsheet characteristics are simply inconvenient or annoying. For example, a blank cell indicates a missing value in a data vector. A missing value changes the degrees of freedom and dimension of the vector. The vector analysis can handle this, although it does affect the results. By contrast, many spreadsheet functions treat blank cells as zeroes, giving incorrect results. The cavalier insertion of zeroes for missing values wreaks havoc on vector analysis. These and related comments are summarized in Table 10.

Table 10 Comparing and contrasting a spreadsheet and a data matrix.

Inductive and deductive reasoning are built into data matrix software. No such discipline exists in a spreadsheet. The spreadsheet may be a standard, but it is an odd standard.

Profit Signals, P-values, Confidence Levels and Standards of Evidence

A null hypothesis always consists of a negative assertion. Here are some examples:

• There is no difference between these two ways of doing things.

• There are no differences among these three or more ways of doing things.

• There is no relationship between these two variables.

• There are no relationships among these three or more variables.

• There are no relationships between these two groups of variables.

The phrasing of a null hypothesis is not a law of the universe. The null hypothesis often plays the role of a "straw man" in inductive reasoning. According to the on-line folklore database Wikipedia, the straw man concept began as a rodeo safety tactic.22 A straw man would distract bulls. It could be torn apart with no harm done. We can tear apart the straw man, the null hypothesis, if it is something we would like to disprove based on the data.
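In a statistical package, tearing apart any of these straw men is a one-line request. The following minimal Python sketch tests the "no differences among these three or more ways of doing things" hypothesis with a one-way Analysis of Variance; the three groups of measurements are hypothetical.

    from scipy import stats

    # Hypothetical measurements for three ways of doing the same job.
    way_a = [10.2, 9.8, 10.5, 10.1]
    way_b = [10.4, 10.0, 10.3, 10.6]
    way_c = [11.1, 10.9, 11.3, 10.8]

    F, p = stats.f_oneway(way_a, way_b, way_c)  # one-way Analysis of Variance
    print(f"F = {F:.2f}, p = {p:.4f}")          # a small p discredits the straw man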

The F ratio, or F statistic, is a signal-to-noise ratio that measures the strength of evidence in the data against the null hypothesis. As the F ratio increases, the strength of evidence against the null hypothesis increases. We evaluate an F ratio by comparing it to a statistical distribution to see how likely it is that a value that large could have occurred by chance alone. The distribution to which the F ratio is compared depends on the degrees of freedom for the profit-signal and noise vectors. As a result, there is no standard scale of comparison for the F ratio.

We get around this by working with a probability computed from the F value. This probability, called the p-value, is the probability of getting an F ratio as large as the value we got by chance alone. If the p-value is small enough, we reject the null hypothesis.

In Microsoft Excel, the cell formula syntax for calculating the p-value is this:

= FDIST(value of F ratio, degrees of freedom for the profit signal vector, degrees of freedom for the noise vector)

For example, the formula to produce the p-value 0.234 in Table 6 is as follows:

= FDIST(2.843, 1, 2)

2.843 is the value of the F ratio, 1 is the number of degrees of freedom for the profit signal vector, and 2 is the number of degrees of freedom for the noise vector. Enter this formula into your Excel spreadsheet and you will get the correct answer: 0.234.

The formula to produce the p-value 0.0085 in Table 8 is as follows:

= FDIST(11.215, 1, 9)

11.215 is the value of the F ratio, 1 is the number of degrees of freedom for the profit signal vector, and 9 is the number of degrees of freedom for the noise vector. Enter this formula into your Excel spreadsheet and you will get the correct answer: 0.0085.
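Statistical packages expose the same calculation directly. In Python, for example, the upper-tail probability of the F distribution reproduces both spreadsheet results:

    from scipy import stats

    # FDIST(F, df1, df2) is the upper-tail probability of the F distribution.
    print(round(stats.f.sf(2.843, 1, 2), 3))    # 0.234, the Table 6 p-value
    print(round(stats.f.sf(11.215, 1, 9), 4))   # 0.0085, the Table 8 p-value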

We do not like writing spreadsheet formulas. We do like the
fact that statistical software does it for us automatically.

As the F ratio increases, the p-value decreases. As the p-value
decreases, the strength of evidence against the null hypothesis
increases. This tends to confuse people. It is easier to think in
terms of confidence levels (Table 11). The confidence level
is one minus the p-value, usually expressed as a percentage. As
the confidence level increases, the strength of evidence against
the null hypothesis increases.
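The conversion is simple arithmetic, as this short fragment shows for the p-values used in this chapter:

    # Confidence level = one minus the p-value, expressed as a percentage.
    for p in (0.234, 0.05, 0.0085):
        print(f"p = {p:<6} -> confidence level = {(1 - p) * 100:.2f}%")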

Table 11 Standards of evidence in
a nutshell. A p-value less than 0.05
yields a confidence level greater than
95%. A p-value less than 0.01 yields
a confidence level greater than 99%.

Closing Arguments

Themis is the Blind Lady of Justice in Greek mythology.

Themis: “As an oracle, I used to advise Zeus when he made
decisions. I did my job so well I became the goddess of divine
justice. You can see from some of my portraits that I used to
carry a sword in one hand and a set of scales in the other. The
blindfold I wore was more than a fashion statement. It meant
I would be fair and equitable in my judgments. My whole
existence hinges on objective standards of evidence.”23


Endnotes

1. American Heritage Dictionary of the English Language, Third Edition. Boston: Houghton Mifflin Company, 1992.

2. Dawkins, Richard. Unweaving the Rainbow: Science, Delusion and the Appetite for Wonder. Boston: Houghton Mifflin Company, 1998.

3. Huff, Darrell and Geis, Irving. How to Lie with Statistics. New York: W.W. Norton and Company, 1954.

4. Taylor, Frederick Winslow. Scientific Management. Mineola: Dover Press, 1998. Pages 55-59. The original 1911 version was published by Harper and Brothers, New York and London.

5. Oxford English Dictionary, 1989.

6. Garrison, Ray H. and Noreen, Eric W. Managerial Accounting, 10th Edition. Boston: McGraw-Hill Irwin, 2003. Page 431.

7. Harrison, G. Charter. Cost Accounting to Aid Production – I. Application of Scientific Management Principles. Industrial Management, The Engineering Magazine, Volume LVI, No. 4, October 1918.

8. Harrison, G. Charter. Cost Accounting to Aid Production – I. Standards and Standard Costs. Industrial Management, The Engineering Magazine, Volume LVI, No. 5, November 1918.

9. Harrison, G. Charter. Cost Accounting to Aid Production – I. The Universal Law System. Industrial Management, The Engineering Magazine, Volume LVI, No. 6, December 1918.

10. Johnson, H. Thomas, and Kaplan, Robert S. Relevance Lost: The Rise and Fall of Management Accounting. Boston: Harvard Business School Press, 1991. Pages 10-12.

11. Anthony, Robert N., and Reece, James S. Accounting: Text and Cases, Eighth Edition. Homewood: Irwin, 1989. Page 15.

12. MacKay, Charles. Memoirs of Extraordinary Popular Delusions. Copyright 2002 eBookMall version available for $1.75. http://www.ebookmall.com/alpha-authors/m-authors/Charles-MacKay.htm

13. MacKay, Charles. Memoirs of Extraordinary Popular Delusions. Copyright 2002 eBookMall version available for $1.75. http://www.ebookmall.com/alpha-authors/m-authors/Charles-MacKay.htm Page 8.

14. Gardner, Martin. Fads and Fallacies in the Name of Science. New York: Dover Press, 1957. Page 106.

15. Sagan, Carl. The Demon-Haunted World: Science as a Candle in the Dark. New York: Ballantine Books, 1996. Page 241.

16. Anthony, Robert N., and Reece, James S. Accounting: Text and Cases, Eighth Edition. Homewood: Irwin, 1989. Page 941.

17. Oxford English Dictionary, 1989.

18. Box, Joan Fisher. R.A. Fisher: The Life of a Scientist. New York: John Wiley and Sons, 1978. Page 97.

19. Box, Joan Fisher. R.A. Fisher: The Life of a Scientist. New York: John Wiley and Sons, 1978. Pages 100-102.

20. Box, George E.P., Hunter, William G., and Hunter, J. Stuart. Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building. New York: John Wiley & Sons, 1978.

21. Dilson, Jesse. The Abacus, The World's First Computing System: Where it Comes From, How it Works, and How to Use it to Perform Mathematical Feats, Large and Small. New York: St. Martin's Press, 1968.

22. http://www.wikipedia.org/wiki/Straw_man

23. http://www.commonlaw.com/Justice.html


Chapter 3

Evidence-based
Six Sigma

Six Sigma (6σ) is a proven, pursuit-of-perfection business
initiative that creates breakthroughs in profitability,
productivity, and quality. It is a highly structured,
project-by-project way to generate bottom line results. It
produces significant dollar value through a never-ending
series of breakthrough projects. Evidence-based decisions
characterize the 18-year, 6σ record of accomplishment.

The essential elements of Six Sigma breakthrough projects are
vector analyses applied to data matrices.

Hundreds of millions of dollars have been placed directly
onto the bottom line of companies around the world using
this improvement model and its tool set. Though large
multi-national corporate results have attracted the most
media attention, we have personally seen a 26-employee
plastic pressure and vacuum forming company achieve
proportionally identical results.

Six Sigma knowledge and know-how have evolved since
the notion of perfect 6σ quality was first conceived by
Motorola engineer Bill Smith. Motorola’s Chief Executive
Officer at the time, Robert Galvin, was the first Six Sigma
Champion. He enthusiastically led the entire program. He
personally removed bureaucratic obstacles to breakthrough
improvements.

Six Sigma became an education and training commodity
during the late 1990s. It gains momentum as it matures.


Six Sigma measurements are recorded in data matrices. A Six Sigma analysis is a vector analysis applied to a data matrix. This analytic process is sometimes called an Analysis of Variance, or ANOVA. As we graphically detailed in Chapter 2, an ANOVA breaks raw data into six vectors (Figure 1). Two are priceless business intelligence commodities: 1) Profit Signals and 2) Noise. Six Sigma gets its name from the vector analysis results. (Historically, Profit Signals have been called "treatment deviations." That appealed to engineers and statisticians.)

The catchy three-syllable "Six Sigma" moniker is value-added packaging for vector analysis and objective evidence. Corporate executives embrace them even though only a few know what the phrase and acronym mean. That is a remarkable accomplishment in anyone's marketing book of records. Since the acronym and its equations are traditionally presented in ways that are guaranteed to bore even motivated academics, the mass market of Six Sigma calls for better branding. Since data matrix applications are essential to vector analysis, calling them Six Sigma Tools has worked wonders. We answered that call.

Six Sigma also conveys substance. Wall Street likes 6σ because it ties customer satisfaction directly to corporate profitability. Customer satisfaction, speed, quality information, and lean organizational structures are Six Sigma cultural values. What is valued gets measured, analyzed and is rewarded. When a company combines computing power, the principles of accelerated adult learning and hands-on improvement projects, breakthroughs routinely lead to quantum leaps in profitability.

Computing power transforms what was once an almost impossibly difficult series of matrix algebra calculations into a single computer command, "Run Model." Anyone who wants to correctly analyze measurement data can now do so in seconds. Every Six Sigma champion executive and, if she or he expects to be promoted, every manager in a Six Sigma company has data matrix software loaded on their personal computers. Though many products are available, two currently dominate the market: Minitab and JMP. Every true 6σ company has its own corporate software standards.

The jargon side of this business initiative is as real as it is regrettable. Acronyms and algebraic symbols are Six Sigma grammar. We identify these hieroglyphics as a courtesy orientation to newcomers.

Figure 1 A complete analysis is composed of six vectors.

Six Sigma (6σ) Basics

Here is the bullet list of Six Sigma basics.

1. Top-level executives personally lead the Six Sigma initiative in highly visible ways.2 Executive compensation and promotion are tied to the use of data-driven, evidence-based decisions. Correct, rule driven analyses of financial and productivity data are evident in Six Sigma executive presentations. Authentic 6σ executives eschew the use of spreadsheet bar graphs and pie charts. The litmus test of leadership is the replication of high dollar value breakthrough projects. If an executive champion does not meet the challenge of these responsibilities, the Six Sigma initiative will fail to produce promised results.

2. Education and skill training in the recognized body of knowledge (BOK) permeate Six Sigma organizations. Computing literacy, which means decision makers know how to use a vector analysis applied to a data matrix, is an expected competency for every leader. Profit Signals quantify what matters most.

3. Exponential rates of improvement are an expected outcome. New ways of getting work done, with fewer

resources, and in a fraction of the time required by
previous methods, take precedence over incremental
process improvements.

4. Measurements and Six Sigma metrics are tied to short-
term and long-term financial performance.

Executive Six Sigma leaders allocate significant personal
time and resources for 6σ projects. In addition to their
own investments, they assign the company’s most capable
people full-time to lead Six Sigma breakthrough projects.
The Executive’s job is to remove bureaucratic roadblocks to
improvement so that managers who have an aptitude for
implementing productive changes can succeed.

The corporate Six Sigma job description hierarchy resembles
titles earned in a martial arts dojo. Full-time Six Sigma
professionals, called Black Belts, are expected to be able to
“kick the heck out of ” any variation that leads to waste or
rework.3 In addition to a Karate/Tae Kwon Do/Kung Fu/
Judo level of intellectual aggressiveness, Black Belts must
demonstrate leadership and good interpersonal skills. They
must be masters of evidence-based decision principles.

Ideally, sensei executive champions coach and mentor 9th
degree Master Black Belts, who in turn coach, mentor and
lead Black Belts. Black Belts then coach and supervise Green
Belts and Yellow Belts. Education and training permeate the
organization. Eventually every employee actively contributes
to the production of breakthrough project results: cold cash to
the bottom line.

The Six Sigma Profit Strategy

Six Sigma improves profits by aiming at perfect products,
services, and processes. In a 6σ culture, everyone is expected
to enthusiastically argue in favor of perfection. A passionate
work ethic attitude carries weight in a Six Sigma culture.
Protests over the possibility of a “diminishing rate of return”
indicate an individual does not understand 6σ fundamentals.

The lower case Greek letter, σ, is pronounced ‘sigma.’ In
the professional world, σ is the symbol for the population


standard deviation. The sample standard deviation, along with
the five other elements in a complete vector analysis, comes
from raw data. It quantifies the amount of random or chance
variation that occurs around the average in any, and every,
given set of data. To understand and embrace the universal
Generalization of Chance Variation is to enter the world of
Six Sigma. Try the following experiment to demonstrate this
physical law for yourself.

First, find a friend you admire. Choose someone with whom
you can discuss controversial information. Now, each of you
needs to print the letter “a” 10 times on a piece of paper in
the exact same way with no variation.4 Go on. Try it.

This exercise is a trick. The task is completely impossible.
Differences in writing tools, variations in ink, paper texture,
handedness, fatigue, font, attention span, concentration, your
interpretation of our instructions, and an infinite number
of other variables all contribute to natural variation. Natural
variation is present everywhere and always. It is ubiquitous.
It is a law of our universe, as powerful as gravity. Every good
product and every service suffers from the inconsistencies
caused by variation.

I. Bernard Cohen, the eminent historian, considers knowledge
of Chance and/or statistical variation to be the distinguishing
characteristic of our generation’s Scientific Revolution.
“If I had to choose a single intellectual characteristic that
would apply to the contribution of Maxwell [though not
directly to his revolutionary field theory], Einstein [but not
the revolution of relativity], quantum mechanics and also
genetics, that feature would be probability.”5 We agree.

This Six Sigma Revolution in business and science is
defined by evidence that is based on Probability rather than
determinism.6 Like it or not, probability overthrows old
doctrine. There is no polite way to summarize the impact
variation has on an individual’s world view. Probability,
dressed up in the Six Sigma costume, is replacing old ways
of knowing—revelation, intuition, and reason—with the
disciplined analysis of experimental observations.

Six Sigma unifies the scientific method and business.
Evidence-based decisions and the power in a vector analysis
are the router connections between the two disciplines. In

answer to the meta-questions, “Does this Six Sigma stuff
really work?” and, “Can you prove it by replicating your
results?” The answer is unequivocally, “You bet.”

With any and every set of raw data we can construct a
tetrahedron, the cornerstone of statistical evidence. When a
standard deviation is combined with an average, we can make
valuable predictions based on a family of probability curves
and surfaces (Figure 2). When one knows the average and
standard deviation (σ) of a process, one can improve that
process to near perfect, 6σ, performance. Perfect quality first
time every time is valuable. This value can be measured with
money.

Figure 2 Data matrix software
automatically transforms the
cornerstone of evidence into
probability distributions.

Figure 3 illustrates old school 1980s corporate Quality
Improvement (QI) aims. Way back then, ‘three-sigma’ quality
was the target.7, 8 This means that the 6σ total process spread
just fits between the lower and upper specification limits
(LSL and USL). At best, this means that 99.7% of process
outcomes satisfy customer requirements. This near 100%
quality sounds better than it is. Recall the unacceptably
wide variation in the prior chapter’s bar chart bamboozling
comparison. At its best, a three-sigma 99.7% distribution
promises ‘only’ 2,700 defective outcomes per million
produced.

A three sigma process may actually produce as many as
67,000 mistakes or defects per million (DPM). This is
because processes typically drift by about 1.5 standard
deviations around their long term average.
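Both figures follow from the normal distribution. Here is a minimal Python sketch of the arithmetic, assuming a one-sided specification limit and the conventional 1.5 standard deviation drift:

    from scipy.stats import norm

    def defects_per_million(sigma_level, drift=1.5):
        # Upper-tail area beyond the specification limit, times one million.
        return norm.sf(sigma_level - drift) * 1_000_000

    print(f"{defects_per_million(3):,.0f} DPM at three sigma")  # ~66,800
    print(f"{defects_per_million(6):,.1f} DPM at six sigma")    # ~3.4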


Figure 3 Three-sigma quality
means that the 6σ total process
spread just fits between the lower
and upper specification limits (LSL
and USL). At best this means that
99.7% of process outcomes satisfy
customer requirements.

To put these numbers into perspective, ‘three-sigma’ aviation
safety would mean several airline crashes each week. In health
care, it would mean 15,000 dropped newborn babies per year.
Banks would lose thousands of checks daily. As it is, three
sigma (3σ) quality costs businesses between 25-40% of their
annual operating income in waste and rework.

Six Sigma breakthrough projects aim to reduce the standard
deviation. High-leverage processes that affect business,
manufacturing, or health care delivery are the prime targets.

The Six Sigma one-part-per-billion (PPB) bell curve in
Figure 4 covers only one-half of the specification range. This
illustrates the dramatic financial benefit of reducing the
standard deviation.

Figure 4 A Six Sigma capable
distribution covers only one half
of the specification range.

Even when the process drifts, only 3-4 defective outcomes
per million (DPM) can occur. In a σ = $1.00 example, a Six
Sigma breakthrough would result in a standard deviation
that equaled $0.50 or less. When this goal of perfection is
achieved, costs related to waste, rework, inelegant designs, and
needless complexity disappear.

The proven rewards for achieving 6σ are: 1) excited customers
and 2) improved profits. Historically, each Six Sigma project
generates a $100-250K benefit. Full-time corporate 6σ
Experts, Black Belts who currently earn about $120K in salary
and benefits, lead three to four projects per year that generate
$1 million in hard dollar, bottom line business benefit. This
10:1 rate of return is so dependable it has become a tradition.

Prior to the development of Six Sigma in the late 1980s, the
only people earning their livings full time using these tools
for breakthrough projects were consultants. We were the only
ones willing to study out-of-date textbooks, use handheld
calculators, rulers, graph paper, and DOS programs.

Thank heavens those days are behind all of us now. Anyone
and everyone can enjoy the benefits of vector analysis applied
to a data matrix. Six Sigma style profits are now a matter of
personal choice.

The Lucrative Project Results Map

Flow diagrams and process maps simplify work. They make
hidden process dynamics visible. Seeing waste and complexity
helps people eliminate both. Flow diagrams like Figure
5 can also be used to create processes that produce perfect
results. To read the diagram, begin with the hard copy
documentation symbol at the upper left hand corner. Follow
the arrows through each of the four levels to the right hand
page bottom.

The acronym used to describe the classic 6σ process is
DMAIC. DMAIC stands for the iterative 6σ project cycle
of Define, Measure, Analyze, Improve, and Control. Once a
project is completed, the process described by this map begins
again. This cycle never ends.

Figure 5 This flow chart has guided projects toward bottom line business results for years.

Define, Measure, Analyze, Improve, Control

The voice of the customer (VOC), customer satisfaction and profit goals come first and last in the Six Sigma DMAIC cycle of improvement. The series of five steps in the top row and the two final steps in the bottom row are top-level management and leadership responsibilities. The middle three levels are Black Belt project tasks. The map marks the boundary of each phase.

Results interpretation, improvement and control require close collaboration between top-level leaders and Black Belts. The process of interpreting statistical results, making an evidence-based decision, optimizing a system, and implementing improvements can and does flatten bureaucracy. Each of these steps takes time, so every 6σ project result needs to be substantial and financial. As 6σ breakthroughs help companies surpass quarterly and annual financial targets, long-term objectives are continuously upgraded to sustain momentum. This commitment is the key to perpetual breakthrough project success at the highest levels of the company.

Six Sigma programs are seen as disruptive when a business values group think. Employees will openly question executives. Cost-accounting reports and risky capital investment Proformas will be challenged with physical models. We advise potential clients who are fond of their bureaucracies to stick with Old School Management methods. Evidence-based decisions and Six Sigma will bring them nothing but trouble. "Don't go there."

In companies with a full commitment to evidence-based decisions, there is broad-based organizational involvement. Occasionally organizations that value bureaucracy manage to "do Six Sigma" while they find ways to sustain paperwork, committees, and supervisory redundancy. Many do. The ones we have worked with and for are populated with delightful, friendly people. These folks just happen to draw an interesting set of Six Sigma project boundaries. Senior management processes and decisions are off limits. Don't laugh. Six Sigma window dressing is immediately apparent to any knowledgeable observer.

Six Sigma employs just about every effective management tool that has ever been developed.9 For example, the project management chart developed by Henry L. Gantt in 1917, called a Gantt chart, is still useful and very much in vogue. A PERT (Program Evaluation and Review Technique) chart, which provides an alternate Gantt chart view of a project, is also popular. Any project management tool you can think of that has proven to be useful is now called a Six Sigma Tool.

In actual practice Six Sigma focuses relentlessly on completing projects within 90-120 days. Experience shows that if a Six Sigma project improvement team fails to deliver bottom line business dollar value within this time frame, organizational commitment to 6σ instantly wanes.

Make no mistake. The Six Sigma field is littered with the corpses of failed Black Belt Projects. Old school managers can and do successfully use neglect to sabotage Six Sigma. Though it is management's responsibility to keep the improvement fires burning, project delays and passive criticism are favored benign neglect techniques. The successful disruption of projects generally returns the culture to less demanding performance standards. The Institution of Old School Management thinking does not surrender until it is surrounded and expelled.

We saw a most eloquent occurrence of this phenomenon in a CEO's behavior. One day, he casually observed to the vice president in charge of implementing Six Sigma, "Six Sigma is ephemeral." The VP looked the word up and discovered, to his dismay, that ephemeral means "dead in a day." After a few months of Six Sigma hoopla, people noticed that he wasn't using evidence unless it supported the foregone corporate agenda. Resistance to evidence-based decisions grew. Projects were not being completed on time. The dollar per dollar return on investment, ROI, was only 5:1. Nevertheless it was informative to watch a Black Belt compete, particularly an experienced Master Black Belt. They do what it takes to bring home the bacon. As a side note, it was interesting to see this Six Sigma initiative generate about $6 million in bottom line benefits by the year's end.

Therefore, we strongly recommend that once a company commits to evidence-based decisions, it stay focused on the money and deadlines.10

Table 1 You can program a spreadsheet to help you choose best projects.

Lucrative Project Selection

Selecting and prioritizing the most rewarding projects is a most important first step. Since time is money and money is time, the selection process must be efficient and fast. Six Sigma team leaders put breakthrough improvement project ideas into a hopper. The hopper is always open, but depending on the culture, new projects are usually given serious review at quarterly and annual intervals.

Table 1 is a simple, virtually universal project evaluation spreadsheet that has emerged as a favorite around the United States. During project review meetings, each idea is ranked from low to high, 1-10, based on experience, in five or more categories. These values are multiplied to create a priority rating. The suggested project with the highest total project priority number is first and so on. This project prioritization process promotes a consensus style agreement that has some quantitative structure to it. We have seen it improve interpersonal working relationships as it generates lists of breakthrough project targets.

Clear operational definitions are a fundamental part of project selection. Operational definitions must be practical. The project and its related issues must be defined operationally. If senior management, the project champion, or the Black Belts who are assigned to the projects do not share a common understanding of these definitions, problems arise.
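The arithmetic behind Table 1 is easy to reproduce outside a spreadsheet. In this minimal Python sketch the project names and rankings are hypothetical; each idea's category ranks are multiplied into a single priority rating and the ideas are listed from highest to lowest:

    # Hypothetical project ideas ranked 1-10 in five categories.
    projects = {
        "Reduce scrap on line 3": [9, 8, 7, 9, 6],
        "Shorten order-entry cycle time": [7, 7, 8, 6, 8],
        "Cut rework in final test": [8, 9, 9, 8, 7],
    }

    def priority(ranks):
        rating = 1
        for r in ranks:
            rating *= r            # multiply the category ranks together
        return rating

    for name, ranks in sorted(projects.items(),
                              key=lambda item: priority(item[1]),
                              reverse=True):
        print(f"{priority(ranks):>6}  {name}")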

Believe it or not, the misinterpretation of a three-letter word has been known to derail projects. For example, write down your definition of the word 'pan.' Good work. Your definition is correct. So are at least 20 others. Pan is a cooking container. You can pan a camera or pan for gold. In Spanish, pan means bread. Pan is also a depression in the earth, a cavity in the lock of a flintlock, and the Greek god of the woods. Any one, or all of these answers, taken in the context of its definition, could be considered to be correct.

Here is another classic example that illustrates why clear operational definitions are crucially important to even a simple process like counting. Count the number of f's in the following paragraph.

FOR CENTURIES IMPORTANT PROJECTS HAVE BEEN DEFERRED BY WEEKS OF INDECISION AND MONTHS OF STUDY AND YEARS OF FORMAL DEBATE.

How many did you count? Pause to write your answer here before moving on. _______

Depending on how you decided to define the letter "f" there are seven possible correct answers. If you decided to count any F, there are 6. If you proof read phonetically, in other words you defined "f" by the sound of the letter, the F in each OF sounds like a "v." So, if you defined an F by the way it sounds you could have counted 1, 2, 3, 4, 5, or 6. There are no lower case, italicized f's, so zero is one correct answer.

For this very good reason, experienced Six Sigma Master Black Belts and Black Belts are very specific when they define what it is that they intend to count or measure. Without a statistical definition, there can be no objective evidence. During the project selection phase, perfect Six Sigma performance expectations called Critical to Quality (CTQ) or Key Quality Characteristics (KQC) are defined in statistical terms. The definition must include an average and a standard deviation. Both come from a vector analysis applied to a data matrix. Since the Profit Signal is also automatically produced by this analysis process, it is considered to be part of a comprehensive operational definition.
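The lesson carries over directly to software: a program can only count what its operational definition tells it to count. This short Python sketch counts the f's in the paragraph above under two of the possible definitions:

    text = ("FOR CENTURIES IMPORTANT PROJECTS HAVE BEEN DEFERRED BY WEEKS "
            "OF INDECISION AND MONTHS OF STUDY AND YEARS OF FORMAL DEBATE.")

    any_f = text.count("F")                # definition 1: any letter F
    of_words = text.split().count("OF")    # each OF sounds like "v"
    phonetic_f = any_f - of_words          # definition 2: F's that sound like f

    print(any_f)        # 6 under the first definition
    print(phonetic_f)   # 3 under the phonetic definition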

Financial Modeling and Simulation

Six Sigma budget models are dramatically different from, and superior to, spreadsheet arithmetic Proformas. The old school cost-accounting variance analysis encourages confabulation by eliminating 5 vectors, or 83 percent, of all the information contained in raw data. By using only one vector, and masking the other five vectors, almost any story holds water. Every manager who has actually participated in the old-school ritual called "spreadsheet scenarios" must candidly admit to making the numbers up. This is a covert impropriety if there ever was one.

Six Sigma tools raise the standards of what does and does not constitute a credible Proforma, scenario, or forecast. A reliable forecast is as transparent as an authentic analysis. All data and all elements are revealed. Legitimate financial forecast models are created using vector analysis rules. When correctly employed, increasingly accurate predictions put the world of continuous spreadsheet revisions to shame. An abacus cannot beat a super-computer no matter how fast one's fingers are. These high analytic standards are used at all levels in the organization.

Once operational definitions are agreed to, an average and a desirable standard deviation for project outcomes are targeted. These figures are accompanied by the expected dollar value of benefits the company can look forward to harvesting. When projects have been identified and Key Quality Characteristics are defined, financial models are used to create credible bottom line profit signal estimates.

Financial Engineering News is one of many trade publications that helps professionals get up to speed on the use of these tools. Dr. John M. Charnes is a frequent contributor. As the Area Director for Finance, Economics, and Decision Science at the University of Kansas School of Business, Dr. Charnes exemplifies leadership in the field. He used the Decisioneering product called Crystal Ball to create a 16 module self-guided study course that we think is excellent.11

His open system flow diagram, with clouds representing thought processes, is shown in Figure 6.

Figure 6 One popular Six Sigma software program uses flow diagrams to graphically detail the iterative cycle used to create and improve financial forecasts.

Simulation is proving to be as beneficial to financial managers as it is to engineers, jet pilots, doctors, and student drivers. Working under old school constraints, engineers had to build expensive physical models to test their ideas. Surgeons had to test new techniques on live patients. Pilots had to practice first solo flights at 600 miles per hour. Parents had to take their teenager into traffic and hope for the best. With multi-dimensional computer simulations, new designs, surgeries, flying skills, and even freeway entrances can be tested "off-line" first to minimize risk. The benefits to simulation are objective and overwhelming.12

This is why computerized simulation is a "Six Sigma Tool." If you look closely under the hood of reputable simulation applications, each has a data matrix and vector analysis for sparkplugs. In a data matrix driven budget forecast, the historical data underlying each budget assumption are graphed in 2, 3 and more dimensions prior to including that assumption in the forecast model. Once assumptions are validated, multivariate models incorporating factor interactions, correlations, and entrepreneurial assumptions are created. The model can then be simulated thousands of times in seconds. The output is presented graphically.

Beginning in the late 1980s, inventive software manufacturers began to develop programs that forced spreadsheets to behave like a data matrix. These macros are now mature modeling programs. They are a joy to use. They are affordable tests. With the finance simulation tool add-in, budget forecasts inherit the power of a vector analysis. Analysis rules, quantification, continuous feedback and discipline improve model forecasts over time. Vector models do an impressive job of helping decision makers visualize probable outcomes. They let leaders meet and beat breakthrough project goals.

Every day we thank the General Electric Senior Vice President who took time out of her day in 1997 for a cold call telephone interview. She explained how these programs push Six Sigma forward. Her counsel was, and remains, rock solid.

Spreadsheets are wildly popular because they let anybody do absolutely anything with any number. Though spreadsheet arithmetic sets the standards of evidence at a comfortably low level, they produce illusion rather than insight. Beyond the covert elimination of 5 analysis vectors, spreadsheet arithmetic budget models and "what-if" scenarios fall short of evidence-based decision standards in significant ways.

1. With a spreadsheet, an individual number in a cell is accepted on face value. This number frequently misleads because it is not framed in a meaningful context. Without analysis context—an average, a standard deviation, probability information, and an analytic graph—people must guess at the number's meaning in relation to the other variables in a system.

2. Spreadsheets encourage analysts to believe in the great bamboozle. Because the columns and rows in a spreadsheet look just like the columns and rows in a data matrix, many conclude equivalent answers are automatically produced with each one. The geometry that guided their design creates graphic results that look terrific. It all looks legitimate! Many now are convinced simple addition, subtraction, multiplication and division are appropriate tools for analyzing complex, multivariate systems. This error is serious.

3. Spreadsheet scenarios create a false impression of precision. Spreadsheet numbers are dressed up in impressive looking data arrays. These images shout out, "Hokum!" Nevertheless they are routinely presented, accepted, and framed as "certain forward thinking statements" in a social gesture of courtesy that smacks of hubris. In corporate hierarchies these courtesies force otherwise intelligent, well-meaning people to forget what they know about mathematics. People are persistent when it comes to juggling numbers. Human nature is tireless in its allegiance to irrational beliefs. Martin Gardner's Fads and Fallacies observation rings as true today as it did when he wrote it in 1952: "How easy it is to work over an undigested mass of data and emerge with a pattern, which at first glance is so intricately put together that it is difficult to believe it is nothing more than the product of a man's brain… Consciously or unconsciously, their perceived dogmas twist and mold the objective facts into forms which support the dogmas, but have no basis in the exterior world."13

4. Spreadsheet scenarios are usually created using One-Factor-at-a-Time (OFAT) methods. OFAT analysis and experimentation methods are no more reliable now than they were when Frederick Taylor used them in the 26 years leading up to 1911. Conclusions reached using this method are at odds with physical Laws of the Universe. Not only do they not yield an accurate answer, answers are notoriously unreliable. By over-simplifying problems, they waste time.

Simulation programs give spreadsheets like Excel a new lease on life. Macros that follow the rules of evidence are bringing high standards to the world of accounting and finance.14 This is a very good thing. With simulation, analysts have a much better grasp on the range of possible budget outcomes. Likelihoods and probabilities are presented automatically in attractive visual graphs. Once a 3D cube model is embedded in a spreadsheet, managers can perform tens of thousands of multivariate scenarios in minutes. This is less time than it takes a skilled controller to

complete a single One-Factor-At-a-Time (OFAT) budget forecast scenario.

Once the simulation is complete, the analyst or manager can evaluate which of the variables has the greatest impact on the bottom line. In addition, and for no extra charge, the simulation automatically produces a sensitivity chart that resembles a spreadsheet bar graph. Vector analysis sensitivity charts rank Profit Signals according to the strength of the evidence for each factor; factors are ranked in importance according to the relative strength of statistical evidence (Figure 8). Sensitivity charts expose counter-intuitive patterns that are masked by spreadsheet arithmetic. A sensitivity analysis ensures that management focuses on the key variables that have the most impact, rather than being distracted by variables they think may be most important. Figure 8 is a sensitivity chart illustration. Simulation can increase one's level of confidence as business decisions are made in the face of uncertainty.

Compare and Contrast Analysis

The classic budget forecast (Table 2) is usually created by estimating three outcomes: 1) best case, 2) worst case, and 3) most likely case.15 Note how the forecast in the bottom right hand cell catches the eye. NanoTech Widgets solve problems. They are multi-purpose. With a projected profit of $9.2 million, they are a sure fire new product. Since there are no rules, personal opinion and a consensus are the only evidence required for making the decision to pursue the NanoTech Widget.

When this spreadsheet was analyzed 1,000 times in under six seconds using the data matrix introduced in the Five-Minute PhD, a much different picture emerged.16 Predictably, Figure 7 does not tell the manager what to do. However, it does point out that the most probable outcome is a $14.4 million loss rather than a $9.2 million gain.
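The mechanics of such a simulation are simple enough to sketch in a few lines of Python. The distributions and dollar figures below are hypothetical stand-ins for the NanoTech Widget assumptions, not the book's actual model, so the printed percentages will not match Figure 7:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000                                   # number of simulated scenarios

    units = rng.normal(100_000, 25_000, n)       # hypothetical units sold
    price = rng.triangular(180, 220, 260, n)     # hypothetical unit price, $
    cost = rng.triangular(150, 170, 210, n)      # hypothetical unit cost, $
    fixed = 2_000_000                            # hypothetical fixed costs, $

    profit = units * (price - cost) - fixed

    print(f"mean profit         ${profit.mean():,.0f}")
    print(f"P(break even)       {(profit > 0).mean():.1%}")
    print(f"5th-95th percentile ${np.percentile(profit, 5):,.0f} "
          f"to ${np.percentile(profit, 95):,.0f}")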

Table 2 The classic old school budget forecast for new product development presents assumption numbers without the benefit of either context or evidence. The average, standard deviation, p-value, evidence strength, analytic graphs, and factor interactions are ignored. Forecasting spreadsheet analysts are simply expected to correctly guess all values.

Figure 7 Based on all the actual data at hand, there is a 77.9% chance of breaking even with the NanoTech Widget. There is only about a 50/50 chance of making the projected $9.2 MM. The most probable outcome is the $14.4 million loss highlighted at the left side of the forecast.

Simulations and legitimate financial forecasts are standards in Six Sigma breakthrough projects. We have seen simulations effectively tackle budgets with up to 77 variables. The level of thoughtfulness this tool creates is well worth the time investment required.
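A sensitivity ranking like the one behind Figure 8 can also be sketched in a few lines. In this hypothetical continuation of the previous sketch, each simulated input factor is ranked by the strength of its correlation with simulated profit:

    import numpy as np

    rng = np.random.default_rng(7)
    n = 5_000
    factors = {                                   # hypothetical assumptions
        "market penetration": rng.normal(0.10, 0.03, n),
        "unit price":         rng.normal(220.0, 15.0, n),
        "unit cost":          rng.normal(170.0, 10.0, n),
    }
    profit = (factors["market penetration"] * 1_000_000
              * (factors["unit price"] - factors["unit cost"]) - 2_000_000)

    # Rank factors by the strength of their relationship with profit.
    for name, x in sorted(factors.items(),
                          key=lambda kv: -abs(np.corrcoef(kv[1], profit)[0, 1])):
        r = np.corrcoef(x, profit)[0, 1]
        print(f"{name:<20} r = {r:+.2f}")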

Figure 8 Success in launching the NanoTech Widget product depends on the company's ability to penetrate the market.

Process Maps

Maps, from Babylonian clay tablet cartography to downloadable Internet driving directions, are universal communication tools. Since maps have proven their value, they too are a "Six Sigma Tool." It is no accident that process maps are the first-choice tool taken down from the shelf after a lucrative new project has been selected.

A good process map is as multidimensional as a set of nested Chinese Boxes or Russian dolls.17 The outermost Russian doll is called a Matreshka or grandmother. Succeeding generations are contained within her. Miniaturized generations are refined replicas. Each three-dimensional replica must be produced using fewer resources and a higher degree of precision. Though nested boxes and Russian dolls are not official Six Sigma tools, these analogies encourage people to look more deeply than a surface appearance. So it is with Six Sigma process maps.

First blush drawings can span pages. These maps are impressive. Over time, these are simplified and distilled into diagrams that illustrate and endorse only essential elements that pull the system forward efficiently. In the same way that the Space Shuttle Radar Topology Mission used vectors, geometry, and computing power to map 80 percent of the earth's landmass in only 10 days' time, Black Belts are expected to map the nested dimensions of a work process in about a week. This is one of the skills that is worth the practice time investment. Practice makes perfect.

The 'Six Sigma Matreshka' in Figure 9 is a Suppliers, Inputs, Processes, Outputs, and Customers map, or SIPOC for short. Drawing these maps from the end to the beginning is the best way to produce a meaningful SIPOC map showing all the relationships.

Figure 9 Black Belts use personal interviews, first hand observations, and measurements to complete this map. It is assumed that this system has feedback loops throughout. These loops are not illustrated here in order to present a clean, simple picture.

Invariably, the hidden factory always makes an appearance. This Six Sigma drawing describes the "hidden-factory." Its appearance has a Charlie Brown and Lucy Van Pelt quality to it. Just as the start of every football season is marked by Lucy tricking Charlie into trusting her for the inevitable betrayal, every Six Sigma map uncovers Figure 10.

Figure 10 The hidden factory of rework in this map includes Processes 4-6 and the related delay.

Waste and rework plague every production and service delivery process. They always happen where: 1) a loop reverses the forward motion of the product, 2) a delay, bottleneck or constraint slows process flow, or 3) a barrier stops production altogether. Hidden factory maps are often posted in conspicuous places. Mapping is a documentation discipline that is rewarding and informative.

The most sophisticated ones, called lean process maps, are drawn in an old fashioned, low-tech way using paper and pencils. Lean maps have a lexicon and icon system that is worth studying.18 Lean is a separate business tool that deserves, and has, its own literature. Suffice it to say lean maps document the entire value stream, from the boardroom to the individual work space on the factory floor. They track information and material flows from start to finish through an organization. Management responsibilities are visible and a host of snapshot measurements are recorded. Lean metrics make sense. They include uptime, working time minus breaks, cycle time (C/T), changeover time (C/

The list of lean measurements continues: value added time (VA), Takt time, inventory turns, the number of operators, the number of product or service variations, First-In-First-Out (FIFO), a plan for every part (PFEP), and production lead time (PLT). In addition, lean maps record production batch sizes for every product interval (EPE). Times are recorded in days, hours, minutes, and seconds. In addition to its own acronym, each time has an exceptionally specific operational definition. The 'time-is-money-money-is-time' theme dominates lean thinking. Rightfully so. For example, in one San Jose Internet router factory, a 2-foot by 2-foot by 2-foot pile of scrapped motherboards was time-valued at more than $6 million. With the lean Six Sigma strategy in place, a second can be, and often is, literally worth thousands of dollars.

Lean measurements and flow mapping earned their way into Six Sigma the old-fashioned way: they work. This is why lean maps are a "Six Sigma Tool." Their record of extremely profitable achievements began in the 1950s Toyota production system and continues to this day. Those who are familiar with lean tools do not argue against them. To do so would be as foolish as arguing against the speed of light, the existence of gravity, or the impact of variation on measurement.19
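Each of these lean time metrics reduces to simple arithmetic. As a minimal illustration, here is a short Python sketch of two of them; every number in it is a hypothetical assumption for illustration, not a figure from the examples in this book.

```python
# Minimal sketch with hypothetical shop-floor numbers: two lean time metrics.
available_minutes = 8 * 60 - 30          # one shift: working time minus breaks
daily_demand = 90                        # units the customer requires per day

takt_time = available_minutes / daily_demand   # minutes available per unit
print(f"Takt time: {takt_time:.1f} minutes per unit")          # 5.0

# Production lead time (PLT): total elapsed time through the value stream.
# Hypothetical stage and delay times, in minutes.
stages = {"receiving": 120, "machining": 45, "rework delay": 480, "shipping": 60}
print(f"Production lead time: {sum(stages.values()) / 60:.1f} hours")  # 11.8
```

Note how the hypothetical rework delay dominates the lead time; this is exactly the kind of hidden-factory signal a lean map makes visible.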

Like a vector analysis, lean maps help people identify process factors, known as the X's, that may be driving the system toward profits or losses. Profits and losses, the dollar value of a process outcome, are called Y's. Figure 11 shows how these factors become a series of hypotheses in a data matrix. Once the matrix is filled with measurements, a vector analysis will point out strong and weak Profit Signals with objective standards of evidence. "Hidden Factory" costs, the costs of waste and rework, are called the Costs of Poor Quality (COPQ). Since the 1950s, breakthrough projects have focused on eliminating these expenses.

Figure 11 Maps help improvement teams identify variables that will be subjected to a 3D vector analysis.

The Costs of Poor Quality

The origins of the "Costs of Poor Quality" idea can be traced to Walter Shewhart's invention of the quality control chart on May 16, 1924.20 The quality control chart is yet another way of graphically viewing a vector analysis.21 Shewhart was a physicist. He was also an accomplished statistician, and the friend and colleague of Ronald Fisher who volunteered to care for Fisher's six children during World War II; a German U-boat's torpedo sank this plan in 1940.22 The dollar figures Dr. Shewhart symbolically placed on the corners of Fisher's work 80 years ago are today's Six Sigma costs of quality. Armand V. Feigenbaum is credited with developing the first dollar-based quality reporting system while working at General Electric in the early 1950s.23 It is worth noting that this development ultimately can be linked to Jack Welch's 1990s Six Sigma initiative.

Why are the costs of poor quality so important to Six Sigma breakthrough projects? Simple.

Revenues are taxed. One dollar in newly earned revenue can produce as little as one penny in new earnings. One dollar saved through the elimination of waste and rework drops to the bottom line as one dollar. In concrete terms, once a Black Belt gets the hang of the breakthrough project system, saving major dollars is like shooting fish in a barrel.

Each of four poor-quality cost categories can be leveraged. Prevention and appraisal investments, often referred to as costs, are relatively static. For Six Sigma, prevention investments include training, education, planning, analytic software, and information systems. Since Feigenbaum first created this classification system, prevention investments have been expected to produce, and have produced, a 10:1 return. Information systems (IS) designed using principles of evidence-based decisions are far less expensive than those that are not. Appraisal investments include quality audits, testing, inspections, maintenance, vendor certification using Six Sigma quality metric standards, and quality assurance system costs. Internal failure costs are hidden factory expenses that remain invisible to customers. External failures are mistakes and errors that are highly visible to the customer.

Figure 12 Textbook example of a Cost of Poor Quality (COPQ) flow chart used by a Black Belt engineer, Scott Erickson, to persuade senior management to embrace evidence-based decisions.

If your company is looking for a place to begin Six Sigma, we encourage you to create an IS strategy that is based on sound geometric principles. The easiest way to learn if your system meets these standards is to ask your IS department to show you their data matrices and cube experiment arrays. This question invariably raises eyebrows. The vast majority of IS systems are modeled after spreadsheets. Transforming such a system, or transferring the information in it to a data matrix, entails rework costs. Once the investment is negotiated, the payback is spectacular.

Internal failure costs include all scrap and rework. Retests, engineering changes, excess inventory costs, Failure Mode Effects Analysis (FMEA), Corrective And Preventive Actions (CAPA), and productivity losses are recorded here. The labor and related equipment required to rework products must all be tallied up. Finally, external failure costs are problems that land in the customer's lap: liability suits, returns, warranty costs, complaint handling, and marketing and sales errors. In our increasingly litigious society it is almost impossible to overstate the costs of failure. Shewhart wrote humorously about the reality of 1939 quality costs:

"I am reminded of the old saying: when a doctor makes a mistake he buries it. When a judge makes a mistake, it becomes law. I would add in the same vein: when a scientist makes a mistake in the use of statistical theory, it becomes part of 'scientific law'; but when an industrial statistician makes a mistake, woe unto him for he is sure to be found out and get into trouble."24

The best way to protect any business from external failure costs is to produce a perfect quality product every time a product is produced. Delivering perfect quality services 100 percent of the time is a powerful business strategy. Perfect processes can and do produce virtually perfect outcomes. Only processes that are capable of producing perfection do produce this level of quality. A process capability index, known as Cpk, is the analytic measure used in graphic presentations of evidence documenting perfect quality.

Process Capability

We will use dice rolls for our example. Each of us has personally rolled dice more than 5,000 times. (Chapter 7 will extend this experiment to include 4 dice.) Feel free to roll your own set of dice until you get 5,000 measurements. Or, with the click of a mouse button, you can accurately simulate the outcome of 5,000 rolls in a minute using software. Comparing both methods will give you a good feel for the value of Six Sigma vector analysis software. We chose the simulation method for this example.

For this example, you can see we set our Lower Specification Limit (LSL) for perfection at 2 and our Upper Specification Limit (USL) for perfection at 12. Software graphs our data and tells us how capable our process is. The Cpk value is calculated by taking that old favorite, σ, and dividing it into the spread of the data. Note how tightly the distribution curve is centered on the target of 7. Even so, Figure 13's process capability curve tells us our process is not capable of producing perfection. If and when the tails of our curve fall above and/or below our perfection specifications, those portions would be scrap and rework. A Six Sigma process yields a Cpk of 2 or more.

Statistical software does not know that the outcome of throwing a pair of dice is constrained to the range 2 to 12. It calculates and estimates statistical limits for a distribution as if it were not constrained. In this way our teaching analogy is flawed. Skeptics complain, "See. Statistics lie! They can't even handle dice rolling." These complaints are ludicrous. Ignore them. The point in this exercise is a principle.

Let's game this system and improve our Cpk by setting our LSL and USL perfection expectations at -10 and 30. Figure 14 shows that our process is now a smoking Six Sigma process, fully capable of producing perfect quality outcomes 99.99999 percent of the time! We're in the money. Or, by now you get the point. In real life, Six Sigma companies earn "high" Cpk values not by lowering their standards, but by raising them relentlessly.

Figure 13 This process is not capable of perfection. Its Cpk value is only 0.640.

Figure 14 A Six Sigma process will produce perfection every time.

To achieve these levels of perfection, Six Sigma companies use a vector analysis applied to a data matrix. The only way perfection can be pursued and achieved is by using quantitative measurements and analysis. The only set of tools that makes this rate of improvement possible is the scientific method and the geometry of a vector analysis. This is why Six Sigma is not a fad. This is why there is such a bandwagon rolling with Six Sigma Breakthrough Projects and 6σ tools.
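For readers who want to try the dice simulation themselves, here is a minimal sketch in Python, assuming NumPy is available. The book's Figures 13 and 14 were produced with commercial statistical software; this sketch uses the simple overall-sigma formula for Cpk, so its value for the 2-to-12 specification will land near, but not exactly on, the 0.640 reported in Figure 13.

```python
# Minimal sketch (assuming NumPy) of the dice capability experiment.
import numpy as np

rng = np.random.default_rng(7)
rolls = rng.integers(1, 7, size=5000) + rng.integers(1, 7, size=5000)  # pair of dice

def cpk(data, lsl, usl):
    mu, sigma = data.mean(), data.std(ddof=1)     # overall sigma; commercial software
    return min(usl - mu, mu - lsl) / (3 * sigma)  # often uses a within-subgroup estimate

print(round(cpk(rolls, lsl=2, usl=12), 3))    # ~0.69: not a capable process
print(round(cpk(rolls, lsl=-10, usl=30), 3))  # ~2.3: the "gamed" Six Sigma Cpk
```

Running both specification limits through the same data makes the chapter's point concrete: the data never changed, only the standards did.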

Endnotes

1 Box, George E. P., Hunter, William G., and Hunter, J. Stuart. Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building. New York: John Wiley & Sons, Inc., 1978.
2 The body of knowledge that is widely regarded as the most comprehensive is posted by the American Society for Quality: http://www.asq.org/cert/types/sixsigma/bok.html
3 Mikel Harry, a popular leader in the Six Sigma field, reported this history on a video tape recorded in 1995.
4 Shewhart, Walter. Economic Control of Quality of Manufactured Product. Brooklyn, New York: D. Van Nostrand Company, 1931. Page 5.
5 Cohen, Bernard. Revolution in Science. Cambridge: Belknap Press of Harvard University Press, 1985. Page 96.
6 Cohen, Bernard. Revolution in Science. Cambridge: Belknap Press of Harvard University Press, 1985. Pages 170, 201.
7 Our bell curve illustrations were inspired by a drawing originally produced by Control Engineering Online.
8 Deming, W. Edwards. Out of the Crisis. Cambridge: Massachusetts Institute of Technology, Center for Advanced Engineering Study, 1982.
9 As a sidebar note, it is interesting to know that Gantt patented a number of devices in collaboration with Frederick Taylor when they worked together at the Bethlehem Steel Mill on Taylor's Scientific Management theory.
10 Inspiration for this particular grid came from Moresteam, http://moresteam.com/ Their on-line Six Sigma Black Belt course is interesting and informative.
11 http://www.fenews.com/
12 http://www.processmodel.com/

13 Gardner, Martin. Fads and Fallacies in the Name of Science. New York: Dover Publications, Inc., 1952.
14 http://www.decisioneering.com The numbers and layout of this budget come from Decisioneering's tutorial example.
15 http://www.decisioneering.com This spreadsheet is used with permission, along with the flow diagram for financial models.
16 ClearVision, http://www.decisioneering.com
17 An interesting history of this symbolism can be found at http://www.nestingdolls4u.com/history/history.htm
18 http://lean.org
19 Womack, James P., Jones, Daniel T., and Roos, Daniel. The Machine that Changed the World. New York: Rawson Associates, Scribner, Simon and Schuster, 1990. Page 184.
20 Harrington, H. James. Poor Quality Cost. New York: Marcel Dekker, Inc., 1987. Page xiv.
21 Shewhart, Walter A. Economic Control of Quality of Manufactured Product. New York: D. Van Nostrand and Company, 1931.
22 Box, Joan Fisher. R. A. Fisher, The Life of a Scientist. New York: John Wiley and Sons, 1978. Page 377.
23 Harrington, H. James. Poor Quality Cost. New York: Marcel Dekker, Inc., 1987. Page v.
24 Shewhart, Walter A. Statistical Method from the Viewpoint of Quality Control. New York: Dover, 1986. Page 40.

Chapter 4
Case Studies

Case studies needed to meet four criteria. First, each story had to be true. Second, though names, places, and data were altered to protect privacy, each story needed to be a fair, representative sampling of what we each have repeatedly seen over the past 20 years of our professional lives. Third, each example needed to graphically explain how evidence-based decisions produced crowd-pleasing financial returns. Finally, it had to be entertaining. We decided to tell the stories the way clients tell them.

Case studies are essential to understanding. Recounting a Six Sigma project victory, however, is like explaining a magic trick. Once a wizard's secret is revealed, someone in the audience thinks, "Shoot! I could have done that." But no one, including the magician, can do it without knowing how.

Occasionally, the stories in this chapter trouble some managers. They hit too close to home. One senior executive reviewer echoed Daniel Sloan's own 1986 Vice President of Marketing's sentiments: "The stories in this chapter upset me. As a senior executive, maybe I just took them personally. It is difficult for me to keep reminding myself that what is past is past. I have to keep telling myself that evidence-based decisions can and will prevent me from repeating history." Though we tried, we were unable to completely resolve this perplexing, vexing journalism quandary. We bit this bullet and chose to include the stories.

Magic tricks are illusion. Evidence-based decisions put real dollars in real banks. Vector analysis is based on immutable Laws of the Universe. Given good data in a data matrix, vector analysis makes it virtually impossible to misrepresent the information in that data. It is transparent. All aspects are revealed. It uncompromisingly tells the truth. Transparency, full disclosure and international standards for data analysis are the reasons Six Sigma works. They are also the characteristics that some find most disturbing about Six Sigma. The cornerstone of evidence, a tetrahedron, symbolizes 'solid evidence'.

Evidence is a funny thing. Many of us are interested in evidence only when it confirms an existing belief or policy. This human tendency creates a resistance to transparent reporting systems in business and government. The position was summarized in the March 1998 issue of Discovery Magazine: "Anybody who claims to have objective knowledge about anything is trying to control and dominate the rest of us…There are no objective facts. All supposed 'facts' are contaminated with theories, and all theories are infested with moral and political doctrines… Therefore, when some guy in a lab coat tells you that such and such is an objective fact…he must have a political agenda up his starched white sleeve."1 This "know-nothing" doctrine stems in part from inadequate science and mathematics education. It is contradicted by the documented successes of the evidence-based decisions that power Six Sigma breakthroughs.

Another human tendency is to equate evidence with authority. Confidentiality is necessary in business and government. Too often, however, confidentiality is used to justify secrecy. No analysis method can deliver us from the unethical corruption of reported data. It is also true that data can be suppressed, 'massaged' or just plain falsified. Then both the data and the analysis are tarred with a brush of cynicism. Disraeli's comment "Lies, damn lies and Statistics" was a reference to this problem.

It is no surprise that spreadsheets have sensational appeal. Spreadsheets snap tightly to the New Age mantra, "Tell your own truth." New Age know-nothings can structure data any way they want, and they can analyze it any way they want. In this sense, transparency and secrecy, honesty and misrepresentation, are equally weighted options.2 Naturally, people tend to construct stories that favor their point of view. Spreadsheets are the engines for cost-accounting variance analysis and break-even thinking. These methods are inherently one-dimensional. Each uses only one of the six vectors in the cornerstone of evidence. None of them recognize the essential Profit Signal and Noise vectors. Because break-even thinking and cost accounting variance analysis allow management to ignore five of the six reality vectors, it is easy to construct any story that is consistent with any one vector. Cost-accounting variance analysis suppresses five-sixths—83 percent—of the information needed for an evidence-based decision.

As Master Black Belt teachers, we use forthright honesty, graphics, computers, software, and the New Management Equation to dispel the mystery surrounding evidence-based decisions. Everyone wants to make more money in less time, with less work, and using fewer resources. Doing more of what works is a doctrine to embrace. Once people harvest Six Sigma profits by making better decisions, objections diminish. Knowledge and reliable information start the Six Sigma DMAIC ball rolling.

Customer Service – Governmental Agency

Political pressure was forcing a Washington State government department to improve the quality of its services or face the loss of $500,000 in funding as a penalty. The department's Executive Director gave employees the opportunity to choose a consultant to help them in their efforts to maintain current funding levels.

A Five-Minute PhD demonstration and evidence-based decision tools attracted their attention. "We can't get anyone to listen to us. We just do what we are told." To begin the project, we promised to help them present their evidence.

Define: Jobs, including management jobs, were on the line. One-half million dollars in legislative funding was at stake. Negative regional news coverage over departmental problems had made state citizens angry. Poor customer satisfaction had put this department on the legislative target hit list. A specific criticism concerning this department's performance had to do with the way it answered its telephones. Several full-time clerical staff answered phones that literally rang off the hook. Answering machines were prohibited because they symbolized poor quality service. The agency's executive director knew calls went unanswered. Armed with her own good judgment, she had instituted a department policy by edict: "All phones will be answered by the third ring." The ringing phones were right outside her office, and she monitored her policy.

There were a number of suspected causes, or hypotheses, for the unanswered phone flash point. These included:

• Hypothesis 1 (H1): The day of the week makes a difference. Some days are busier than others.
• Hypothesis 2 (H2): The time of day makes the difference. Some times are busier than others.
• Hypothesis 3 (H3): The telephone line makes the difference. One line is busier than the other.

Measure: The team of secretaries who answered the phones claimed they had a good solution to the problem. We suggested that they use a check sheet to collect data. Using paper and pencil, the team of secretaries constructed a check sheet to record matches with the cube experiment data matrix (Table 1).

Table 1 The cube experiment data matrix guided the collection of data recorded by hand on a check sheet.

Analyze: The data matrix revealed a distinct profit signal. Figure 1 presents the data in a cube plot. Hypothesis 3 was the "big hitter." The numbers of calls on line 2, the back face of the cube, were an order of magnitude larger than the numbers of calls on line 1, the front face of the cube. The main effect was so obvious everyone could see it immediately just by looking at the matrix. No other variable had an effect.

Figure 1 All of the high numbers fell on the back plane. Line two was sending a clear profit signal.

It turned out that line two had been listed incorrectly in telephone directories across the entire state. This proofreading error was embarrassing. It would have been expensive to fix. No one had the courage to bring it to the attention of the agency's executive director.
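The arithmetic behind a cube plot's main effects is easy to reproduce. The sketch below uses hypothetical call counts (the agency's actual check-sheet data are not reproduced here), with line 2 roughly ten times busier than line 1, mirroring the pattern the secretaries found:

```python
# Minimal sketch with hypothetical call counts: main effects from a
# 2x2x2 (cube) experiment like Table 1's.
import itertools

# Runs in standard order; each factor coded -1/+1: (day, time, line).
runs = list(itertools.product([-1, 1], repeat=3))
calls = [12, 130, 14, 122, 11, 141, 13, 128]   # line at +1 is ~10x line at -1

for j, name in enumerate(["day of week", "time of day", "telephone line"]):
    # Main effect = (mean response at +1) - (mean response at -1).
    effect = sum(y * run[j] for run, y in zip(runs, calls)) / 4
    print(f"{name:>14s} effect: {effect:+7.2f} calls")
```

With data like these, the telephone-line effect dwarfs the day and time effects, which is exactly what "all the high numbers fell on the back plane" means geometrically.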

The executive director's edict compounded the fear factor. Rather than addressing the core issue, the workforce had decided it was much easier to keep their heads down. They became telephone operators and gave dialing assistance to callers.

Improve: One hour after the presentation of our evidence, a telephone answering machine was purchased and installed. The answering message announced the Yellow Pages error and then gave callers the correct number. The executive director gave this improvement her blessing with a belly laugh. We took their evidence forward with a firm conviction that, in at least this case, the messenger would not be shot so early in a consulting engagement. The total time to collect data for the data matrix was five days. The analysis and presentation took one hour.

Control: Telephone listing corrections were made the following year. A breakthrough in the proofreading process ensured 100 percent, Six Sigma accuracy. Six secretaries and other workers could now focus their attention on real work. Eventually one full-time position was eliminated through attrition for a bottom-line savings of more than $25,000. This breakthrough played a role in persuading legislators to sustain funding at existing levels. This, combined with avoiding the loss of funding, brought the total value of the project to $525K.

Days in Accounts Receivable

A service company needed to reduce its number of days in accounts receivable, or AR days. The number of days in AR ranged from 35 to 110. A breakthrough improvement project could yield as much as $35K per month, or $420,000 per year, in cash flow.

Define: For more than a year, debate had raged over what could be done to reduce AR days.

Suspected causes for this problem varied, and there were hundreds of them. We identified five important variables that might affect AR days. These suspicions, or straw-man hypotheses, included the following:

• Hypothesis 1 (H1): Management is the solution. Good managers have short AR days. Bad managers have lots of days in AR.
• Hypothesis 2 (H2): Sales calls are the answer. The number of visits made by a salesman to the customer is key. The more visits, the larger the number of AR days; the fewer the visits, the smaller the AR days.
• Hypothesis 3 (H3): The customer is the main reason for long or short AR days. Good customers pay fast. Poor customers pay slowly.
• Hypothesis 4 (H4): The longevity of our customer relationship makes the biggest difference. Long-term customers pay more slowly because they know our business depends on them.
• Hypothesis 5 (H5): The number of services provided determines the number of AR days. More services create complexity. Billing complexity slows payment.

Measure: Significant AR data had been collected. Each customer, and there were hundreds of them, had its own manila folder. These records were stored in file cabinets. It was with a great deal of pride that the accounting team showed that bills were filed in near-perfect chronological order. The Chief Financial Officer of this company was committed to keeping productive hours in line and on budget in order to keep operating costs low. Workers in his department were required to do their jobs as well as to work on breakthrough projects. No overtime would be paid for improvement tasks. A regular work schedule would be continued. Moreover, no statistical software would be purchased: "Spreadsheets work fine." We interviewed every employee and constructed process flow diagrams.

Going to Plan B, we used our own statistical software to create an optimal data matrix for five independent variables at two levels each (Figure 2). This took all of five minutes. Creating a data matrix is one thing; it is virtually impossible with a spreadsheet. Collecting data that fit the profile of each run is another.

Figure 2 Statistical software automatically determines the best data matrix geometry for a vector analysis involving five independent variables.

The CFO had vetoed a recent budget request for a PC workstation and relational database software, so automatic queries and data mining were out of the question. A billing clerk and a billing manager volunteered to come in over the weekend and pull records. Their daily workload was so challenging, they simply didn't have time to array any more spreadsheet data than they were already arraying for the CFO during the regular workday. They believed they were familiar enough with customer profiles to find bills that would match each of the 32 different "runs" in the matrix. These two front-line leaders wanted to find out what combination of variables actually made a difference. They knew that if they found an answer, it would be valuable.
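Generating a two-level design for five factors really is a minutes-long job for software. Here is a minimal Python sketch of the idea, using the five straw-man hypotheses as factor names; the book's own matrix came from commercial DOE software, which can also choose smaller optimal fractions of the full 32 runs.

```python
# Minimal sketch: the full 32-run, two-level design for five factors.
import itertools

factors = ["Manager", "SalesCalls", "Customer", "Relationship", "Services"]
design = list(itertools.product(["low", "high"], repeat=len(factors)))

print(len(design))   # 32 runs: every combination of the five factors
for run_number, run in enumerate(design[:4], start=1):
    print(run_number, dict(zip(factors, run)))
```

Each printed run is a "profile" that the billing clerk and manager had to match with a real customer bill pulled from the file cabinets.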

Figure 2 shows the first 28 rows of the data matrix with the number of AR days visible in the far-right response measure column. The every-other-row pattern of a short number of AR days followed by a long number of AR days was evident to the accounting department workers immediately, on a Sunday morning.

Analyze: Three strong profit signals emerged from the vector analysis we applied to their data matrix. The p-values in Figure 3 appear under the heading "Prob > F". The two factors, Customer and Relationship, and their interactive effect, were statistically significant. We could say with better than a 99.999 percent level of confidence that the customer was a main effect. The computerized vector analysis also showed, with a 99% level of confidence, that the length of the customer relationship was another active factor that influenced the number of days in AR.

Figure 3 P-values less than 0.05 imply a 95 percent level of confidence or more in the results.

The key difference between the two customers was widely known. Customer A was billed electronically. Customer B was billed manually. New customers were able to bill electronically. Old customers were not. The main effects were controversial. Anxiety filled the air. The company's Chief Financial Officer and Chief Executive Officer disliked computers. They still do. The CFO openly opposed the use of statistics. The CEO had excused the finance department from participation in breakthrough projects "until the data matrix and vector analysis tools proved to be useful." A year earlier, the Chief Financial Officer had refused to approve the purchase of a $15,000 PC workstation in his department, in order to keep costs down.
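The "Prob > F" values in Figure 3 are ordinary analysis-of-variance p-values. The following sketch shows the same style of computation using the statsmodels library and hypothetical AR-day numbers (the company's actual records are not reproduced here):

```python
# Minimal sketch (assuming pandas/statsmodels and hypothetical data):
# two factors plus their interaction, with "PR(>F)" = JMP's "Prob > F".
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.DataFrame({
    "customer":     ["A"] * 8 + ["B"] * 8,
    "relationship": (["new"] * 4 + ["old"] * 4) * 2,
    "ar_days":      [36, 38, 35, 40, 55, 58, 52, 57,    # customer A
                     70, 74, 68, 72, 95, 99, 104, 98],  # customer B
})

model = ols("ar_days ~ C(customer) * C(relationship)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # F statistics and p-values per term
```

With data shaped like these, both main effects and the interaction show very small p-values, the same pattern the project team saw.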

A $1,000 request to purchase data matrix software for the finance and accounting department was also denied. "Spreadsheets work fine." The electronic billing and relational database topics were verboten.

Figure 4 explains part of the reason that executive resistance to evidence-based decisions continued. This particular finance department found 3D cube graphs upsetting; 3D vector analysis pictures do not look like bar graphs or pie charts. Figure 4 presents accurate AR-day predictions for 2⁵, or 32, differing combinations of all five factors. Note that both of the top cubes have shorter AR days. When the AR days come from customer A and a new relationship, AR days are lower than with any other combination of factors.

Figure 4 The two-level, five-factor vector analysis compares all the factor interactions using the traditional 3D cube.

Improve: The team spent a week gathering its courage and preparing evidence for a presentation to senior management. Following the presentation, the company purchased and installed a top-of-the-line workstation.

Control: Results produced by lowering days in AR by 30 days exceeded the projected $420,000 cash-flow gain in the first year. The total time required to complete the project was 90 days. As the financial crisis passed, however, so did the use of evidence-based decisions. To this day, the company has refused to invest in either the education of its finance and accounting workforce or the purchase of data matrix software. "Spreadsheets work fine." The heads-up improvement team put their heads down and went back to work.

This experience taught us to present profit signals using a special kind of bar graph known as a Pareto chart rather than the more powerful cube. People just want the answer. The Pareto chart in Figure 5, which rank-orders profit signals from strong to weak, gives customers what they want in a way that poses no visual threat.

Figure 5 Profit signals are easy to spot using a graph that ranks vectors from strong to weak.
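A Pareto chart of this kind is simple to build. Here is a minimal matplotlib sketch, with hypothetical effect magnitudes standing in for the project's actual vector analysis results:

```python
# Minimal sketch (assuming matplotlib and hypothetical effect sizes):
# a Pareto chart ranking profit signals from strong to weak.
import matplotlib.pyplot as plt

effects = {"Customer": 38.0, "Relationship": 21.5, "Customer*Relationship": 9.0,
           "Services": 2.1, "SalesCalls": 1.3, "Manager": 0.6}
names = sorted(effects, key=lambda k: abs(effects[k]), reverse=True)

plt.barh(names[::-1], [abs(effects[n]) for n in names[::-1]])
plt.xlabel("Magnitude of effect on AR days")
plt.title("Profit signals ranked strong to weak")
plt.tight_layout()
plt.show()
```

The same information as the cube plot, but as a ranked bar graph: strong signals at the top, noise at the bottom, and no 3D geometry to alarm anyone.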

Breaking the Time Barrier3

Long waits in hospital emergency departments are legendary. The Joint Commission on Accreditation of Health Care Organizations (JCAHO) has recognized the critical nature of Emergency Department (ED) overcrowding. The list of likely causes includes overcrowding, an aging population, a shortage of nurses, a shortage of inpatient and/or long-term care beds, and a saturated primary care system. JCAHO has instituted new Emergency Department Overcrowding Standards requiring a hospital's serious attention. The Emergency Department is the front door of a hospital. It often accounts for a significant percentage of all admissions. Service excellence that meets or exceeds the public's expectations is essential.

The newly hired administrator of a community hospital, a 30-year veteran with a Masters of Public Health administration degree and a certified Six Sigma Black Belt, selected the ED as the hospital's initial Six Sigma project on her first day of work, after reviewing the hospital's admission data and top revenue-producing departments. The ED Charge Nurse had told her, "Our ED is closed to ambulances. We are on divert status. We cannot provide safe care if one more sick patient comes through those double doors." The RN, CEO administrator glanced around. She saw vacant treatment rooms. Three staff members were cautiously watching her from behind the nurses' station. They were all pretending to be charting. The nearby ED physician gave her an apologetic smile and said, "This happens all the time. You might as well get used to it." In response, the Black Belt RN, CEO administrator asked, "What are the standards of evidence you use when you decide to close the ED?" The Charge Nurse responded, "Well, we are simply overwhelmed."

The following Monday, the administrator called a meeting to discuss the "ED Divert" issue.

or 12% of available time. © M. . why that would solve the problem. Why at Mid-Valley. The Team began planning steps in the Six Sigma DMAIC breakthrough process. They ran it and analyzed their data. their ED is closed twice as much as ours.” “The CT tech takes call from home after midnight. Case Studies 129 services. by “other” departments in the hospital. that they firmly believed was the primary cause of ED divert. Boyles. 8 hour shifts. The Black Belt CEO led a brainstorming process to identify Critical To Quality (CTQ) factors. They designed an experiment. Those closures penalized patients. “Every hospital in the city is having the same problem.” “If you want us to put our nursing licenses at risk. . Daniel Sloan and Russell A. No one in the ED was familiar with Six Sigma techniques or tools. . It was an inevitable result of growing volumes. During the past six months. . the Emergency Department had been closed or on diversion (divert) more than 5300 minutes/month. The list of suspected reasons for going on diversion status were as varied as the professional team that sat around the table. and the Emergency Medical Treatment (EMT) director from the local fire department responsible for the paramedics all attended. well . in large part. Everyone was resigned to the status quo. Nevertheless they began to wrestle with a complex process that involved most of the hospital’s departments. 2003 . This is a crucial DMAIC first step no matter what the project is.” “There are never any beds in the ICU. We’re always waiting for her to come in. No amount of effort could reduce ED divert time. The general theme was that Emergency Department diversions were caused. from Admitting to X-ray. they were surprised.” “If the Cath Lab crew was in-house 24/7. . All Rights Reserved. or three and two-thirds 24 hour days. This totaled eleven.” As they reviewed the actual numbers from data that had been collected and arrayed in a data matrix. Every member had her or his own favorite reason. or two or three. They cost the hospital hundreds of thousands of dollars in potential revenue.

a 48% reduction from the same period the prior year.9 hours.6%. • The number of Emergency Department visits increased by 12. Define issues systematically.6 hours to 1. • Intensive Care Unit bed availability increased by 10. • A 38. The admission process simply slowed to a near stop.26% increase in Emergency Department gross margin was generated. The department was astonished. Any amount of wait time could be rationalized. Boyles. everything changed. All Rights Reserved. sustainable results.130 Case Studies Break Through Results The profit signal vector analysis showed that once a decision to admit a patient was made. • Patient satisfaction increased from 59% to 65%. performance pressure was off.6%. Once this practice was halted. • The average ED Length Of Stay (LOS) shrank from 3. The application of DMAIC to this hospital’s ED overcrowding and diversion problem produced dramatic. DMAIC DMAIC is the standard Six Sigma breakthrough project methodology. © M. Issues identified included time on ambulance divert. In the first two months their initial project results list included: • The average hours on ED Divert dropped from 88 to 50 per month. An “acceptable” wait time for a patient being admitted was open- ended. • Catheterization Lab time dropped from 93 minutes to 10 minutes. measurable. 2003 . statistically and practically. Daniel Sloan and Russell A.

a decision to admit or not. Improve the process using evidence-based decisions to power Six Sigma breakthroughs. The team established performance measures and targeted benchmark targets for each goal. CTQ variables identified for evaluation were the patient’s gender. and considerable lost revenue.0. patients leaving without treatment (LWOT). diagrams. Door-to- door Length of Stay (LOS). Ready availability of an ICU bed. © M. observe the process and begin data mining. The data were gathered in less than 24 hours (Figure 6). All Rights Reserved. Prepare appropriate quality control charts and Design of Experiments (DOE) to determine CTQ factors. and process flow diagrams. Boyles. They identified potential Critical to Quality (CTQ) variables they believed influenced the department’s Length of Stay (LOS). Evaluating these 8 factors required only 16 runs to complete. Control the process to insure that break-through improvements were sustained. created an 8-Factor Designed Experiment on her laptop. Analyze data using a vector analysis applied to a data matrix. Left Without Being Treated (LWOT) as a proportion of all patients. laboratory and imaging testing were other variables. 2003 . Identify and array CTQ factors. low patient and staff satisfaction levels. models. and patient satisfaction levels were identified as the CTQ factors the team wanted to study. using data matrix software. The Black Belt Administrator. Measure using maps. They prioritized reducing (minimizing) Emergency Department Length of Stay (LOS) as the key response. Wisdom Gained Along The Way The knowledge experts on the ED Six Sigma team used inductive reasoning. Case Studies 131 unacceptable ED patient length of stay (LOS). Collect data. Daniel Sloan and Russell A. Everyone felt that all other issues would improve if LOS could be reduced. a “slow” and “fast” physician or nurse (identified by employee number). JMP 5.

Figure 6 Custom design for an 8-factor, two-level Emergency Department Length of Stay (LOS) experiment.

Six Sigma techniques, including a carefully designed experiment and rigorous data analysis (computerized software makes it easy), provided evidence at the 95% level of confidence. This confidence level helped managers make critical decisions quickly. The results of the designed experiment (DOE) were surprising to members of the Six Sigma project team (Figure 7). Before the project, ED staff and physicians had ranked laboratory turnaround time as the most significant CTQ factor influencing length of stay in the Emergency Department. CT technician availability ranked a close second. This was one of many "ah-ha's." Don't trust your assumptions or your "gut," even if you are an expert.

Figure 7 Results of an 8-Factor Designed Experiment.

The project team discovered the CTQ factor with the most impact on ED LOS was admission status. Those patients being admitted had a significantly longer length of stay than those who were treated and released. Running a close second was the availability of an Intensive Care Unit bed. Both were significant at the 99.99% confidence level. Finding these two highly significant factors focused the team's efforts. While these CTQ factors, admission status and availability of an ICU bed, may appear obvious now, they were not at the outset of the project. Yet the most frequent reason given for instituting the ED diversion status and closing it to customers was "No ICU Bed."

When the administrator began discussing ICU bed availability with the nursing staffs in the ED and ICU, she quickly uncovered an insidious attitude: "Us against them." Nurses (and, to a lesser extent, physicians) believed that "their" department worked harder and the "other" department was attempting to shift its workload to them. Staff and physicians operated under the false assumption that "most" ambulance admits were very sick and "nearly all" would require an admission to ICU. There was a related assumption that the EMTs expected an ICU bed to be immediately available or they would take the patient to another hospital. A drill-down of data revealed that less than 9 percent of the patients who entered the hospital's ED by ambulance were ultimately admitted to the ICU. The small percentage of admissions to ICU was a surprise to the EMT medical director. He voluntarily educated his staff so they would rely on the judgment of the ED staff. Attitudes quickly changed.

The lack of trust between the ED and the ICU required immediate attention. Nurse managers evaluated and resolved issues between their departments. They arranged schedules and provided time for nurses to "walk in the shoes" of nurses in the other department. Nurses gained an appreciation of the unique and essential role each service provided to quality patient care. ED and ICU service medical directors developed patient admission and transfer criteria that were approved by the Medical Staff.

The criteria, based on a patient's need for intensive nursing care, authorized nurses to transfer patients out of ICU to open a bed for a new admission, unless the ICU notified the ED to hold the patient. Communication between the two departments was difficult. Telephones might go unanswered in the ICU due to the immediacy of patient care needs, yet the ED nurse was required to provide a detailed report to the ICU nurse before she could initiate a patient transfer. Working together, the nursing staffs developed a 1-page report that the ED nurse would fax to the ICU in the event they were unable to complete a telephone report. A transfer of the patient to the ICU occurred automatically 30 minutes after the report was faxed. This is now an uncommon occurrence.

An unintended but exciting result of the Six Sigma ED project was a stunning reduction in "Door to Cath Lab" time. (This is a measure of time from the patient's arrival in the ED to the initiation of treatment in the cardiac catheterization lab.) Before the ED project, average door-to-cath-lab time was a respectable 93 minutes. While this time met national standards, it was longer than the hospital's nearby competitor. EMTs transported their most critical patients to the competitor hospital because of its superior door-to-cath-lab time. A flow process diagram revealed the problem. Patients in the field with a potential diagnosis of myocardial infarction (MI) were evaluated by the EMTs, in consultation with the ED physician. When they arrived in the ED, they were re-evaluated by the ED physician, including completion of lab work and a repeat EKG, before the cath lab team was notified. The delay in calling in the cath lab team cost precious heart-muscle-saving time. Drill-down analysis of outcome data revealed that the EMTs diagnosed MI with nearly 100% accuracy. With the support of the administrator for the potential cost of additional cath lab salaries in case the EMTs' diagnosis was incorrect, ED staff were encouraged to rely on the EMTs' field diagnosis and initiate the call to the cath lab team as soon as the EMTs called in from the field. Door-to-cath-lab time plummeted to 10 minutes!

Effecting and sustaining significant change is hard work. The need for change creates strong emotions in people, particularly when you are the one who is expected to change. People experience roller-coaster emotions of fear, loss, and denial before reaching acceptance. This is all normal. A critical function of the Black Belt is to manage people's feelings and emotions so improvements occur and are sustained.

The success of this project had a positive impact across the hospital. All departments and staff learned to value 'their' ED as the 'front door to their hospital.' At the end of the first year, with an ED diversion time of near zero, the Emergency Department treated over 37,000 patients and realized a gross margin of $18 million. This was a 38.26% improvement over the previous year.

"Beating Heart" Bypass Grafts

Though altruism and evidence influence medical treatments, economic pressure drives improvement. Historically speaking, medical "Six Sigma" style breakthroughs have astonished the world. Near-zero death rates related to surgical anesthesia and the polio vaccine's safety record are but two near-perfect success examples. Sir Austin Bradford Hill's 1951 sentiments sound as fresh as a 21st Century General Electric Six Sigma news release: "In treating patients with unproved remedies we are, whether we like it or not, experimenting on human beings, and a good experiment well reported may be more ethical and entail less shirking of duty than a poor one." (Br. Med. J. 2:1088-90, 1952)

The ability to consistently replicate experimental outcomes with a high degree of confidence is of paramount importance to everyone in the health care system. Multi-million dollar savings created by "beating heart," or "off-pump," coronary artery bypass outcomes are a case in point. The off-pump surgical technique provides an ideal compass setting that points the way to breakthroughs.

Since health care Six Sigma breakthroughs simultaneously improve the quality of patient outcomes and profitability, "off-pump" coronary artery bypass graft (CABG) projects are substantive. Compelling statistical evidence is leading to the reluctant acceptance of this surgical technique in competitive USA health care markets. Define, Measure, Analyze, Improve, and Control, the classic evidence-based decision cycle, provides a convenient way to summarize this story.

Define: For over 40 years, the use of cardiopulmonary bypass (CPB) pumps defined coronary artery bypass grafting (CABG) procedures. Good outcomes and the relative ease of working on an arrested heart led most cardiac surgeons to favor the use of CPB.4 Statistically significant blood utilization and neurological side effects associated with on-pump surgeries were considered to be acceptable—necessary—collateral damage related to the bypass operation. Limited financial resources fostered the early 1980s development of "beating heart" CABG surgeries in Argentina. It has taken a decade for surgical practice patterns to emerge that reflect sentiments expressed by researchers in 1992: "Further research should be directed to which subgroups can be operated on to advantage off-pump and which, if any, groups of patients should be confined to on-bypass operations."5 Though statistical evidence suggested off-pump operations were safe and advantageous for select patients, the prevailing beliefs of cardiac surgery sustained physician commitment to the on-pump surgical technique. Patient demand for this lower-cost, higher-quality procedure has forced, and is forcing, surgeons to master a challenging, higher standard of care.

Patterns and pattern recognition are key elements in the identification of breakthrough improvements. Figure 8 illustrates the classic, standard Six Sigma closed feedback system. The closed feedback loop idea is a serious theoretical error that can be traced to the 1990s pseudoscience of "systems thinking."6 Closed feedback loops create entropy. Database and computing systems accelerate both quality and profitability when they are included in an open-system feedback loop. Closed feedback systems, by contrast, are driven by opaque spreadsheet analyses and storytelling, where 83 percent of the information contained in raw data is suppressed.

Figure 8 The recommended Six Sigma closed-loop feedback system is contrary to evidence-based decisions. Closed loops create entropy.

Evidence-based decisions must have open feedback systems. Open feedback systems depend upon the continuous entry and flow of objective evidence into judgments. Obviously doctors, nurses, allied health professionals, and administrative leaders are the Six Sigma "executive champions and Master Black Belt" experts who initiate breakthrough improvement actions. In the off-pump/on-pump dialogue, one qualitative signal is the long-running practice of opinionated debates between surgeons. Without a commitment to evidence-based decisions, these discussions are generally sustained without referencing or generating statistical evidence for analysis.

Measure and Analyze: Though surgical practice data are often collected by hand, increasingly these data are automatically entered into databases. Integrated statistical software packages now make it possible to analyze measurement data almost as quickly as they are recorded. In addition to quantitative, open-loop feedback measures, qualitative impressions frequently expose opportunities. Figure 9 shows columns and rows of data for a single cardiac surgeon who decided to master the off-pump surgical technique after a number of his patients canceled their scheduled on-pump surgeries in order to have them performed off-pump by a different surgeon at a competing hospital.

Figure 9 A data matrix arrays historical data so a vector analysis can be used to identify profit signals. This array documents charges, lengths of stay (LOS), and type of CABG surgery, either off-pump or on.

The peer-reviewed literature on this topic is consistent to a remarkable degree. The computerized analysis of length-of-stay data in Figure 10 reflects findings that are similar to the 443 peer-reviewed articles published on the on-pump/off-pump subject since 1992. Patients who undergo off-pump CABG surgeries experience dramatically lower lengths of stay.

Figure 10 The strong profit signal between the lengths of stay for on-pump and off-pump surgeries is eye-catching with a statistically accurate "flying saucer" graph. On the hyperspace thrill ride of a vector analysis applied to a data matrix, the difference between data sets is significant at the 95% confidence level if the saucers can fly past each other without crashing.
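Stripped of the animation, the "flying saucer" comparison is, at heart, a two-sample significance test on mean length of stay. Here is a minimal sketch with SciPy and hypothetical LOS values (the surgeon's actual data are not reproduced here):

```python
# Minimal sketch (assuming SciPy and hypothetical length-of-stay data):
# comparing mean LOS for off-pump versus on-pump CABG patients.
from scipy import stats

los_off_pump = [1.9, 2.1, 1.8, 2.4, 2.0, 2.2, 1.7, 2.3]  # days, hypothetical
los_on_pump  = [5.8, 6.4, 7.1, 6.0, 6.9, 5.5, 7.4, 6.2]  # days, hypothetical

t, p = stats.ttest_ind(los_off_pump, los_on_pump, equal_var=False)
print(f"t = {t:.2f}, p = {p:.2g}")  # p far below 0.05: significant at 95% confidence
```

When the p-value is far below 0.05, the "saucers" (confidence regions around each mean) clear each other with room to spare.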

A quality control chart. so does variation around the mean. Even a novice can interpret the results at a glance. all four of the shortest lengths of stay related to CABG are located on the cube’s left plane. All Rights Reserved. Since 1931.875. Daniel Sloan and Russell A. The numbers contained in the rectangular boxes at the cube’s corners are average values. Factors we considered were diagnostic (ICD) code variations. The Cartesian coordinate system’s cube is an ideal graphic for presenting multidimensional statistical evidence. An example is shown in Figure 12’s cube plot. gender. age. As the average length of stay shrinks. co-morbidities. All of the © M. Case Studies 139 Figure 11 The Profit Signal in patient Lengths of Stay (LOS) were related to off-pump CABG surgeries. was a result of an off-pump surgery with a male patient with ICD code 36.11. Figure 11. These breakthroughs now lead to near perfect performances known as Six Sigma. The shortest average length of stay. 1. The surgeon’s database was stratified to facilitate a three- dimensional statistical analysis to consider the effect a number of other factors might have had on length of stay outcomes. 2003 . Literature searches used to cross check statistical inferences are a value added service physicians appreciate. provides another view of the impact off-pump surgical technique brings to the quality of patient care. Boyles. and race. In Figure 12. These improvements were dramatic. this pattern has symbolized the classic breakthrough pattern of an evidence-based decision.

All of the longer lengths of stay are located on the cube's right plane. The longest average length of stay, 6.875 days, was the effect of on-pump surgeries for men with ICD code 36.12. Though three factors are presented simultaneously, the only statistically significant factor related to a lower length of stay was a surgery performed off-pump. We can say with a 95 percent level of confidence that when off-pump surgeries are used on appropriate patients, they produce medically superior outcomes and lower lengths of stay.

Figure 12 Profit signals compare the surgeon against herself.

Improve: Sixteen years of experience in promoting breakthrough improvements in health care quality and productivity teach an important lesson. Before changes occur in physician or hospital practice, benefits must be translated into a compelling financial story. Though this reality can be disheartening for caregivers who put patient safety first, leaders must prioritize cost accounting if they expect to see system-wide improvements take place.

This case study did not include a Pareto chart analysis summary, for two reasons. First, the organization had progressed beyond the need to present data in a simplistic way; decision makers wanted to look at advanced evidence charts. In addition, the data matrix software used to produce the evidence in this case did not have that feature.

Simulation modeling using spreadsheets is a relatively easy data matrix tool to master. More often than not, spreadsheet simulations are persuasive, Six Sigma style. The psychological impact of seeing 1,000 or more iterations of multivariate spreadsheet practice scenarios is significant.
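The same kind of simulation can be sketched outside a spreadsheet. The Python fragment below, using NumPy, runs 10,000 iterations of a deliberately simplified revenue model. Every distribution and parameter in it is a hypothetical assumption chosen for illustration, so it will not reproduce the Figure 13 forecast.

```python
# Minimal sketch (assuming NumPy; the book's forecast used a commercial
# spreadsheet add-in) of a Monte Carlo revenue simulation. All inputs
# below are hypothetical assumptions, not the hospital's actual model.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000                                        # simulation iterations
cases = rng.poisson(150, n)                       # off-pump cases per year
margin = rng.triangular(3_000, 6_300, 9_700, n)   # net gain per case, dollars

gain = cases * margin
low, high = np.percentile(gain, [5, 95])
print(f"5th-95th percentile of annual gain: ${low:,.0f} to ${high:,.0f}")
```

Watching thousands of plausible futures collapse into a forecast distribution is precisely the persuasive effect the text describes.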

Figure 13 shows the profit signal's probable financial impact for one surgeon. The low end of the forecast's distribution suggests that by mastering the off-pump procedure for the majority of her patients, an additional $448K in revenue would be generated. On the high end of the distribution, this change could produce as much as $1.45 million. Revenue gains for off-pump surgeries were thus predicted to range from a net gain of $448K to $1.4 million. Actual results fell near the center of the prediction parameters. Savings were achieved through lower nursing care costs and overhead. Off-pump patients avoided adverse side effects while the hospital enjoyed improved profitability. These results are classic hallmarks of a Six Sigma style breakthrough.

Figure 13 Spreadsheet add-ins for modeling and simulation are a compelling, persuasive use of the data matrix and profit signal analysis.

Control: The final step in the Six Sigma DMAIC (Define, Measure, Analyze, Improve and Control) process is to standardize breakthroughs and hold the gains. Discipline is as important to success here as it is in each of the other steps. When the medical staff and other senior leaders are disciplined, and when they role-model the use of science, statistical analysis, and systematic experimentation, breakthrough improvements occur. Six Sigma culture evolves along with the breakthroughs. Leadership and culture determine the rate of adoption for breakthroughs in productivity and quality. The degree of success in every Six Sigma breakthrough is directly related to the level of commitment that is demonstrated by senior leadership.

The Daily Grind

Don worked in the belt grinding department. Day after day, he and his co-workers removed "gate stubs" from metal castings to prepare them for final processing and shipping. The grinders were paid a handsome hourly rate. The other major expense for the area was the cost of belts. They went through a lot of belts on a typical shift.

Define: If you try to use a belt beyond a certain point, your efficiency in removing metal goes way down. The supplier representative had given the area manager a rule to use for deciding when the grinders should throw a belt away and put on a new one. The rule was called "50% used up". The purpose of the rule was to minimize the total expense of the operation. There were examples of belts that had been "50% used up" hanging on the walls in the grinding area.

Don thought the rule was wrong. He thought it caused them to discard the belts too soon. He had a hypothesis that using the belts a little longer would reduce the belt expense with no loss of grinding efficiency. He also suspected that the supplier wanted to sell more belts. We had no way to evaluate this, so we let it go. Don had come up with a new rule called "75% used up". He proposed doing a designed experiment to determine whether or not the new rule was more cost effective than the old rule.

We met with Don, the area manager and the supplier rep to discuss the project. To our surprise, the supplier rep was vehemently opposed to the project. He said we were wasting time trying to "reinvent the wheel". He said the "50%" rule was based on extensive experimentation and testing at his company's R&D laboratory.

Don argued that laboratory tests may not be good predictors of shop-floor performance. We thought he had a point. We were also starting to see why he was suspicious of the supplier. The area manager also thought Don had a good point. He gave the go-ahead for the project. He allowed Don one full day to complete the experiment.

Measure: Don figured he could get 16 castings done in one day. When the other grinders heard about the experiment, they suggested other things that could be tested at the same time. The contact wheels currently used on the grinding tools had a low land-to-groove ratio (LGR). One of the grinders wanted to try a wheel with a higher LGR. Another wanted to try a contact wheel made out of hard rubber instead of metal. A third reminded Don that belts of at least two different grit sizes were routinely used. He felt that both grits should be represented in the experiment to get realistic results.

Table 2 contains the data matrix for the grinding experiment as it was eventually run. There were four factors at two levels each. The response variable was the total cost for each casting divided by the amount of metal removed. The total cost was calculated as labor cost plus belt cost.

Table 2: The data matrix for Don's grinding experiment.
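The 16-run layout of a table like Table 2 is easy to reproduce. The sketch below, in Python, builds the standard two-level grid; the factor names follow the story, and the -1/+1 coding is the usual convention rather than the book's exact table.

    from itertools import product

    # USAGE: 50% vs 75% rule; LGR: low vs high; MATL: steel vs rubber;
    # GRIT: the two belt grit sizes.
    factors = ["USAGE", "LGR", "MATL", "GRIT"]
    runs = list(product([-1, +1], repeat=len(factors)))  # 2^4 = 16 castings

    print("run  " + "  ".join(f"{name:>5s}" for name in factors))
    for i, levels in enumerate(runs, start=1):
        print(f"{i:3d}  " + "  ".join(f"{level:+5d}" for level in levels))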

Analyze: An eyeball analysis applied to Table 2 suggested that Don was on to something with his "75% used up". It also suggested that high land-to-groove (LGR) is better than low, and rubber wheels are worse than metal ones.

Figure 14: Pareto Plot ranking the factors and interactions in the belt grinding experiment by the strength of their profit signals.

Figure 14 shows the Pareto Plot ranking the factors and their interactions by the strength of their profit signals. The strongest signal was the comparison of steel to rubber contact wheels (MATL). This signal told us that rubber was not a good idea. The next-largest signal was the comparison of the 50% rule to the 75% rule (USAGE). It predicted significant savings in line with Don's idea. The third-largest signal was the comparison of a low to high land-to-groove ratio for the contact wheel (LGR). The next two signals involved interactive effects. The message here was that the actual cost reductions from implementing the USAGE and LGR results would be different for the two grit sizes.

Improve: Don's experiment produced two recommendations:

1. Use his 75% rule instead of the supplier's 50% rule.
2. Use contact wheels with the higher land-to-groove ratio.

The combined impact of these two changes was a predicted cost reduction of $2.75 per unit of metal removed. This multiplied out to about $900,000 in annual savings. Not bad for a one-day project. Don's recommendations were quickly implemented throughout the grinding department. The actual savings came in a little under the prediction, but everyone was happy.
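For readers curious about the arithmetic behind a ranking like Figure 14's, each bar is an effect estimate, and each effect is a weighted average: the average response at a factor's high level minus the average at its low level. Interactions use the elementwise product of two factor columns. A minimal sketch follows; the cost figures are invented placeholders, so only the method, not the resulting ranking, carries over from Don's experiment.

    import numpy as np
    from itertools import product

    names = ["USAGE", "LGR", "MATL", "GRIT"]
    X = np.array(list(product([-1, 1], repeat=4)))      # the 16-run design
    y = np.random.default_rng(0).normal(10.0, 1.0, 16)  # placeholder costs

    effects = {}
    for j, name in enumerate(names):
        # Dot product with the +/-1 column, divided by half the run count,
        # equals (average at +1) minus (average at -1).
        effects[name] = X[:, j] @ y / 8
    for j in range(4):
        for k in range(j + 1, 4):
            effects[names[j] + "*" + names[k]] = (X[:, j] * X[:, k]) @ y / 8

    for name, effect in sorted(effects.items(), key=lambda kv: -abs(kv[1])):
        print(f"{name:12s} {effect:+.3f}")

Sorting by absolute size, as in the last loop, is all a Pareto plot of effects does.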

Control: But let us not be hasty. Some degree of cost reduction was achieved by all the grinders, but it did not apply uniformly. There was still a lot of variability in grinder performance. Attacking this variation was the obvious next step. We don't know if our recommendation was ever implemented.

"Die Tuning" for Vinyl Extrusion

A vinyl extrusion operation receives a "die package" (blueprint) from a customer for a new "profile" (part). The extruder then designs and machines the "die" (tooling) for extruding the profile. The extruder bears the development cost in exchange for a life-of-contract "sole supplier" status.

Once the initial machining of a die is completed, a tester runs that die on one of several extrusion lines reserved for testing new dies. Once the production line stabilizes, the tester does visual inspections and measures the control dimensions with a caliper. The tester is also supposed to determine the best run conditions for the new die. The inspection results and the dimensions are taken to a revision programmer who determines whether a revision is needed. If it is, the revision programmer sends the die back to the machine shop with a revision sheet describing the needed changes. Each "revision" involves re-machining the die. The process of machining, testing, and revising dies is called die tuning.

The average cost per revision is about $2000. The number of revisions required to get a new die ready for production varies unpredictably from 0 to as high as 30. As a result, the total cost varies unpredictably from $2000 (no revisions needed) to something like $50,000 (lots of revisions needed). An extruder can easily spend $1.5 to $5.8 million or more each year on die tuning. Reducing the dramatic variation in the number of revisions was identified as a project with potentially huge financial benefits.

Define: We started with a "Kaizen-blitz", a very fast and focused review of the die tuning process. Testers adjust a number of process variables to get good parts out of a new die. Potentially these factors could include:

• Line speed
• Die-to-calibrator distance
• Calibrator vacuum
• Screw Revolutions Per Minute (RPM)
• Screw oil temperature
• Barrel zone temperatures
• Die zone temperatures
• Melt temperature
• Melt pressure
• Weight

Testers are under time constraints. They adjust some of these variables by trial and error to get the dimensions closer to nominal and improve the cosmetic quality. Examples are lowering the line speed or increasing the weight. The variables most commonly adjusted are line speed, die-to-calibrator distance and weight. The other variables tend to remain at "baseline run conditions" assigned before the die is machined.

Our findings were as follows:

1. Die revisions were based on single measurements taken by a hand-held caliper on plastic parts. In all industries the repeatability of such measurements is notoriously bad.
2. The trial-and-error method has virtually no chance of finding good run conditions.
3. Letting testers choose which variables to adjust may have long-term economic consequences.

Item 1 looked like a possible "smoking gun" for the problem of too many revisions.

We proposed that small series of designed experiments be made a routine part of die tuning. The basic idea was this: before we cut metal again, let's see if we can "process our way out" of some of the dimensional or cosmetic problems. This would require more time for each revision cycle. But this process held the promise of dramatically reducing the number of revisions.

We felt the Design of Experiments (DOE) approach could address all three findings. Some of the team members wondered how it could help with Item 1. The answer was that the results of a DOE are always based on weighted averages rather than individual measurements. This automatically improves the reliability of the data used to determine revisions.

Measure: For the initial experiment, the team decided to observe four continuous factors: line speed, die-to-calibrator distance, calibrator vacuum and weight. There were four continuous factors at three levels each. We used statistical software to generate a data matrix similar to the one shown in the first six columns of Table 6. The matrix in Table 6 is the as-run version, with the weights and calibrator vacuums actually obtained in place of the nominal values in the original matrix. The levels of the four factors are coded to protect proprietary information.

The die in this case had a dual orifice. This means that two profiles are extruded at the same time. Results for the two profiles are distinguished in the matrix as Sides 1 and 2. The response variables included 13 control dimensions and a 1-5 distortion rating, where higher is better. The control dimension data are expressed as deviations from nominal in thousandths of an inch.

Table 6: The data matrix that fills the next page is from the die tuning experiment. Remember, because this table is a data matrix, each column is a single entity or vector. A correct analysis breaks up the variation vector in the cornerstone of evidence into Noise and Profit Signals.

Analyze: A matrix of distribution curves was the result of jointly optimizing all 14 response variables. The software performed this optimization in just a few seconds. Please accept our apologies for the fact that the complexity of this statistical graph exceeds the boundaries of this introductory book.

Improve: The implications were staggering. The quick story follows. The statistical tuning experiment doubled the line speed and reduced material costs by 50 percent. The production line produced perfect quality product after just one revision and some very minor additional die tuning.
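The joint optimization described under Analyze can be imitated on a small scale. In the sketch below, both fitted models and the compromise criterion are hypothetical placeholders; real software fits a model for each of the 14 responses from the data matrix and then searches the factor space for the best compromise.

    import numpy as np
    from scipy.optimize import minimize

    def dimension_deviation(x):
        # Hypothetical fitted model for one control dimension
        # (thousandths of an inch from nominal).
        speed, distance, vacuum, weight = x
        return 2.0 * speed - 1.5 * weight + 0.5 * vacuum

    def distortion_rating(x):
        # Hypothetical fitted model for the 1-5 rating, higher is better.
        speed, distance, vacuum, weight = x
        return 3.0 + 0.8 * distance - 0.3 * speed

    def penalty(x):
        # Compromise: keep the dimension near nominal, distortion high.
        return dimension_deviation(x) ** 2 - distortion_rating(x)

    best = minimize(penalty, x0=np.zeros(4), method="L-BFGS-B",
                    bounds=[(-1.0, 1.0)] * 4)  # coded factor ranges
    print("suggested coded settings:", best.x.round(2))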

[Table 6, the as-run data matrix for the die tuning experiment, fills this page.]

Additional key findings were as follows:

• We were able to run a four-factor die tuning experiment in one day.
• We generated a wealth of information on how each factor affects each response variable. Some results confirmed prior beliefs; others contradicted prior beliefs.
• We showed that using weight and line speed as adjustment factors in die testing leads to unnecessarily high weights and low line speeds, which in turn lead to a larger number of revisions. This locks in unnecessary costs for the life of a contract. It may also contribute to problems with quality.

Control: The process of changing the way die tuning is done is underway. Similar experiments have been run on other new dies with similar results. In one case a die was saved in the nick of time from going back for an incorrect revision that would have spawned further revisions to repair the damage. A conservative estimate of the annual cost reduction from extending this method to all new dies was $1.2 million, half of the current annual budget for die tuning. Much has been accomplished. More is expected.

Endnotes

1 Cartmill, Matt. "Oppressed by Evolution". Discovery Magazine, March 1998, pages 78-83, as reported by Richard Dawkins on page 20 in his book Unweaving the Rainbow.
2 http://gi.grolier.com/presidents/aae/side/knownot.html
3 Cheryl Payseno, an RN, former hospital administrator and certified Six Sigma black belt, wrote this case study for us. Cheryl led the charge for the use of Designed Experiments in health care in 1995 with Daniel Sloan. Results from those early innovations were published by the American Society for Quality's Quality Press.
4 Pfister, Albert J., Zaki, Salah, et al. "Coronary Artery Bypass without Cardiopulmonary Bypass." Ann of Thorac Surg 1992; 54:1085-92.

5 Pfister, Albert J., Zaki, Salah, et al. "Coronary Artery Bypass without Cardiopulmonary Bypass." Ann of Thorac Surg 1992; 54:1085-92.
6 Senge, Peter M. The Fifth Discipline: The Art and Practice of The Learning Organization. New York: Doubleday Currency, 1990.

Chapter 5

Using Profit Signals

Profit signals show you the money. Profit signal vectors literally and figuratively show you what works best in any business, financial, health care, manufacturing or service process. This chapter explains how vector analysis applied to a data matrix showcases the information contained in raw data. Once the tools have done their job, the graphic presentation of evidence paves the way to breakthroughs in quality, productivity and profitability.

The spreadsheet is the first and only computing program many business people learn to use. When all you have is a sledgehammer, everything looks like a spike. Hammering out spreadsheet revisions keeps employees occupied. These people are occupied reworking Proformas, business plans, and trying to explain why actual monthly financial results do not fall exactly on the predicted straight line of a one-dimensional "variance" analysis. Though they are busy, they may not necessarily be productive.

Vector analysis applied to a data matrix is the steam engine that humbles them. This natural occurrence unsettles old-school managers. Some react like the mythical John Henry: "Before that steam drill shall beat me down, I'll die with a hammer in my hand."

Profit signals are like televisions, radios, cars, telephones and the Internet. They attract attention. People want to play with them. They want to use them.

Fortunately, a data matrix channels the intelligence and logic of the best minds our human species have produced. Rigorous inductive and deductive reasoning, which dates back to Aristotle, is built into statistical software designed specifically for the data matrix structure. The following are a few of the many reasons why so many former skeptics embrace the use of profit signals to make more money.

1. With profit signals, you have only one formula to remember.
2. With profit signals, you don't have to solve equations.
3. With profit signals, you can produce 10 times the work in a fraction of the time now spent doing arithmetic with a spreadsheet.
4. Profit signal pictures are aesthetically pleasing. The cornerstone of evidence is appealing. By constructing a cornerstone-of-evidence tetrahedron using bamboo skewers as vectors and spheres of Sculpey Clay as points-in-hyperspace connectors, people can weigh evidence in their own hands. The look and feel of an Analysis of Variance tetrahedron in one hand and a single stick in the other convert would-be 21st Century Luddites into evidence-based decision champions. Physical models win hearts.
5. Money. Reason number 5 is THE big reason Six Sigma projects are so popular around the world. Profit signals help you make more money with less work.

A Better Way to Look At Numbers

Think back to your Five-Minute PhD. In a data matrix, each number is an integral part of an entity called a vector. Each column is a field or variable with a precise operational definition. Each column of numbers is its own vector. Each number is framed in the geometric context of a profit signal vector.

Measurements presented in the rows and columns of a spreadsheet convey no sense of unity. There are no vectors, no arrows. Each number is an orphan locked in its own cell. A ghost named Zero inhabits empty cells. Commonsense relationships between numbers are ignored. There is no sense of purpose. Logic takes a back seat to manipulation. Arithmetic is the two-stroke engine running Abacus Prison.

Vectors show you the money. Vectors have physical properties. These properties can be measured and displayed in three dimensions. We strongly urge you to actually build a Sculpey-Clay/bamboo skewer model whenever the dimensions of a vector analysis are revealed to you in one of our examples. It costs about one buck for the whole kit. The results will be more rewarding.

Corrugated Copters

C. B. Rogers created the helicopter analogy while working at Digital Equipment in Marlboro, Massachusetts. Professor George E. P. Box introduced us to it at the University of Wisconsin, Madison in May 1995. Dr. Box, the Fisher Professor of Statistics and a Fellow of the Royal Society and the American Academy of Arts and Sciences, was also a riveting teacher who taught us that an analysis of variance was so simple, "You could tell the answer just by looking at the numbers on a cube." He and his colleagues used the helicopter in Figure 1 to illustrate.

Please take a moment to build one now so you can follow along with our data explanation. If you have a pair of scissors and quality paper use them. These tools will make the construction process more satisfying. If you are in a hurry, tearing paper works fine. First, tear a piece of 8.5 inch by 11 inch paper in half, long ways. Next, cut or tear the top section to form the "blades." Finally, follow the folds at the bottom to form the helicopter's fuselage, or body.

Figure 1: This inexpensive product is an analogy that works well for teaching data matrix and vector analysis principles to people in all industries.

You may tape the body to give it some rigidity if you like. Hold the finished product with the blades perpendicular and away from the body at shoulder height. Let it drop. Like seeds from a maple tree, the blades will catch air while the aircraft spins to the ground. This is fun to do and fun to watch. Now, time the flight using the black, blue, purple, or pink plastic digital chronometer you wear on your wrist.

Corrugated Copters learned a big lesson when their company was founded in 1996.¹ Their original corporate slogan was, "Drive down costs!" This saying has become a ritual chant that opens all management meetings. Their current, more enlightened view is wordier: "The best way is the most profitable way."

For this game, each helicopter costs $9 million to build. Longer flight times are worth quite a bit more money than shorter flight times. For each second of additional flight time, customers are willing to pay an additional $1 million in price. Eliminate all costs associated with take offs and you can really make money.

Take a moment now to draw a Six Sigma Supply, Input, Process, Output, and Customer (SIPOC) flow diagram. You and your products are parts of a system. Complexity surrounds Corrugated Copters. You will learn Corrugated Copters is a behemoth that demands global logistical support. The market is filled with uncertainty and risk.

Consider the Supply. The paper began as a seed that was planted on a tree farm in the Pacific Northwestern United States in 1948. The quality and cost of that tree affects the quality and cost of your building materials. One company cuts down the tree. Another ships it as Input to the pulp mill. The pulp mill Process creates the paper. The packaged Output is sold to its wholesale Customer. Corrugated Copters is the retail customer who buys it from the wholesale customer.

The company's measuring device is a five-mode wristwatch with alarms. It used to be silicon, some oil, and ore. It breaks hours into hundredths of a second. Since time is money, and money is time, the calibration of this instrument is exceptionally important. Accuracy matters. The store that supplies this watch keeps a supply of them on hand just in case you need a new one in a hurry. One of your employees has created a Lean flow diagram to show the entire value stream for your watch. The pen or pencil you used to record your measurements also has an informative SIPOC diagram archived for reference in the event another new Six Sigma breakthrough is needed. The most efficient routes for delivering these devices to your engineers are annotated with dollars, times, and inventory turns. Just-In-Time has eliminated almost all of Corrugated Copter's inventory costs.

Last and certainly not least, the brains behind Corrugated Copter's success have been, to varying degrees, educated. Not everyone is cut out to be a helicopter pilot. Not everyone could hope to be a timer. It almost goes without saying that collecting data is a big job. The analysis of that data is yet another specialized task that has its own job classification.

Testing the Current Way of Doing Things

Avona Sextant, a Corrugated Copter senior executive, has a Five-Minute PhD. More than anyone else in the company, Avona is committed to evidence-based decisions. Avona will listen only to stories that have evidence in their punch lines. Some employees have heard quite enough of her New Management Equation speech. Though there is resistance to her methods, no one argues with her fundamental point of view: "The best way is the most profitable way." On this they are in full agreement. The problem is how to determine which way is best. On this there is a considerable amount of debate.

Some think Avona is goofy. Some suspect her peculiar predisposition is a genetic disorder. Others think she is crazy like a fox. They suspect that Avona's little formula for calculating Chance variation only works with simple numbers like 3, 4, and 5. They also know this Six Sigma stuff is a passing fad. They are going to wait it out and hope for the best. Bets are routinely placed over when and how many times she will say the word "evidence" in a meeting. In any case, money is won and lost.

Avona is often called upon to facilitate meetings. When she is not in the room, Copter teams seem to argue amongst themselves. When Avona joins their dialogue, teams just naturally converge on answers that lead to a consensus and a "path forward."

During a recent productivity breakthrough, the mid-management team of Tom, Dick and Mary produced a double-digit flight time! Just yesterday they booked a record-breaking 10 seconds. They are proud of themselves and bragging when Avona walks in.

"Ten seconds. What an awesome and terrific flight time!" Avona cheers. "That's worth ten million dollars in gross revenue. That would be a profit of one million dollars." She adds, "I can't wait to see the rest of your evidence."

"Evidence?" asks the team.

"This is so exciting," said Avona. "You must have flown this machine more than once. I just want to see your other measurements."

The team showed her all their data: 9 seconds, 8.9 seconds and 10 seconds.

"Oh, I see you're using a spreadsheet," said Dick. "I used to use one of those."

"Yes it is," said Avona. "Plus I also had an abacus for my backup system." Avona's aunt in Hong Kong taught her how to use an abacus when she was a little girl. Because her abacus was a Chinese rather than a Japanese machine, she learned long ago to translate binary numbers into regular old numbers and back again with the flick of her right index finger.² The abacus was the world's first computing system. When Avona first saw vector analysis applied to a data matrix, she knew the time had finally come to retire her abacus and her spreadsheet too.

Avona was still waiting for her statistical software purchase order to be approved. In the meantime, she had programmed a worksheet with vector-analysis formulas built into the cells. With her templates, people didn't have to type in any formulas. She input the three data points. Her Excel spreadsheet immediately produced the vector analysis displayed in Table 1.

"Our objective of 9 seconds is a fixed number rather than a measurement," she explained. "So the first step is to subtract it from the raw data. This gives us the Differences vector. If we average more than 9 seconds when we launch the product line I will be euphoric. Otherwise we won't make any money in the long run." She drew the picture in Figure 2 to illustrate the vector analysis of the difference data in Table 1.

"Is that what accounting calls a variance?" Mary observed.

"Good call Mary. I don't think that department ever thought of a column of numbers as a vector. They do now. The new Six Sigma Black Belts in Accounting are changing history!" said Avona. "I have no idea how they are going to solve the 1,000 year old waste and rework problems related to the 14th Century's double entry bookkeeping system.

"Our Black Belt CPAs are arraying entries into a data matrix. With a properly designed data matrix, the second entry is needless rework. Our Black Belt CPA Peruzzi told me the tip off for her was the word 'double'. Get it? 'Double entry? Rework entry?' Peruzzi is convinced the entire double entry 'bookkeeping system' is nothing more than a massive hidden factory loop. It is a marvel what Six Sigma education and training can do. Some have already doubled their personal productivity. It's wonderful."

Table 1: Vector analysis for testing the current helicopter design against the performance objective. The raw data are flight times in seconds. The profit signal coincides with the average difference from the 9 second objective. It has one degree of freedom because it is determined by a single number, its average, 0.3. Noise is calculated by subtracting the Profit Signal value from the respective value in the difference vector.

"See how the squared length of the difference vector, 1.01, is equal to the sum of the squared lengths 0.27 and 0.74?" she shined (Table 1).

"Well, having those numbers add up is no big deal, even for a Merchant of Venice. Oh come on Avona," chided Mary.

"That's how the New Management Equation works," said Avona. "Everything always adds up. This arithmetic is a Law of the Universe. See Figure 2."
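Avona's arithmetic is easy to verify with a few lines of code. The sketch below assumes the three flight times quoted in the story; the @ operator gives each vector's squared length as a dot product with itself.

    import numpy as np

    raw = np.array([9.0, 8.9, 10.0])        # flight times in seconds
    difference = raw - 9.0                  # subtract the fixed objective
    signal = np.full(3, difference.mean())  # profit signal: the average, 0.3
    noise = difference - signal             # noise: what the average misses

    print(round(difference @ difference, 2))  # 1.01
    print(round(signal @ signal, 2))          # 0.27
    print(round(noise @ noise, 2))            # 0.74, and 0.27 + 0.74 = 1.01

The last line is the New Management Equation at work: the squared length of the difference vector splits exactly into profit signal plus noise.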

Figure 2: This is the picture of the key vectors in Table 1.

"The last time I saw this stuff was when I had to learn to use a slide rule in Mrs. Beamer's algebra period," complained Tom. "Did you notice that when you multiply a minus times a minus, the sign becomes a plus? I always have a hard time remembering that a negative number like -0.3, minus a positive number like 0.1, turns out to be a bigger negative number! And look how confusing that -0.4 in the Noise vector column is."

"Oh it is. It is!" said Avona. "Laws of the Universe strike again."

Though the team was tired of Avona's boundless enthusiasm, with a push of the square root button on their calculators they could see that the sample standard deviation, the square root of the 0.37 Variance, was about 0.6 seconds. Their average flight time was about 9.3 seconds. Even if these numbers perfectly described how future times would vary in long-run production, the best performance the team could expect would be that future flight times were unlikely to fall below a lower "three-sigma limit" of about 7.5 seconds [7.5 = 9.3 - (3 x 0.6)].

To make matters worse, Avona started talking about evidence. "The null hypothesis here is that our future average flight time will be 9 seconds. We want to disprove this hypothesis.

0. These differences are Table 2 Standards of evidence probably due to Chance.05 gives ‘clear and convincing’ evidence against the null hypothesis (Table 2). “Also. and construction. The best way is the most profitable way. the time could vary up to 10. wind. timing device. We want the average flight time to be higher. This means there is no evidence at all that the average flight time is significantly different from 9 seconds. We will be making money on half.” table.160 Using Profit Signals hypothesis.428. Depending on variations in the weather. just a lot of noise in our system. This means our long-run profit will be zero. “The data will show this is true if and only if the p-value is small enough. paper. By international standards. not even close to the lowest standard. Our p-value is 0. a p-value less than 0. It is a Law of the Universe. pilot. All Rights Reserved.15 gives a ‘preponderance of evidence’ against the null hypothesis. Daniel Sloan and Russell A. A p-value less than 0. $9 million. To illustrate the implications of her conclusion. 2003 . Boyles. if the mean is exactly 9 seconds. our average gross revenue will be exactly equal to our cost.8 and all the way down to 7. © M.” The room was quiet. and losing money on the other half.2 seconds. Avona drew the picture in Figure 3. This is not good. “There is no signal here. We want to make money.

To illustrate the implications of her conclusion, Avona drew the picture in Figure 3.

Figure 3: The Normal distribution of flight times if the mean is 9 seconds and the standard deviation is 0.6 seconds.

Everyone had taken a liking to Avona's signal/noise analogy months ago. They all agreed with her interpretation of their data. Ten was an exciting, encouraging number. But, as usual, they needed to know more before they could launch the new product. The team ended their argument with an agreement. There were two possibilities: (1) The problem might just be the small sample size of 3. They could do more tests of the current design to strengthen the signal and reduce the noise. This would let them determine the average flight time with greater accuracy. (2) They might need to go back to the drawing board and find a way to further increase the flight time.

Six Sigma experts know that the New Management Equation discovered by an old Greek named Pythagoras 2500 years ago is worth billions of dollars today. For 2500 years the right triangle has shown us the route to profitability. Ancient Greek mariners used the sextant to navigate the Mediterranean Sea's lucrative markets. In his little book, Posterior Analytics, Aristotle equated the right triangle with truth.³ Applying vector analysis to a data matrix on a regular basis is a good way for today's seekers of truth to learn about Aristotle's principles.

In analysis, as in telecommunications, customers want a strong signal. Communications engineers from Marconi in 1901 to Nokia in 2003 have appreciated the value of a high signal-to-noise ratio. The data matrices and vector analyses employed by engineers differ only superficially from the matrix and vectors you used to earn your Five-Minute PhD.

Overcoming Obstacles

"Science phobia is contagious,"⁴ wrote Carl Sagan, an astronomer and television celebrity, just prior to his death in 1996. "Some people consider science arrogant—especially when it purports to contradict beliefs of long standing or when it introduces bizarre concepts that seem contradictory to common sense. Like an earthquake that rattles our faith in the very ground we're standing on, challenging our accustomed beliefs, shaking the doctrines we have grown to rely upon can be profoundly disturbing."⁵

The transparent analysis principles in the cornerstone of evidence shake the foundations of business decisions. Executives may find the data matrix and vector analysis distressing. Until they get the hang of using these tools, both concepts tend to terrify cost-accounting analysts. Once on board, these executives and analysts become vital assets for breakthrough project teams, just like Avona.

Math phobia is another, even more daunting obstacle on the high road to evidence-based decisions. In our consulting practices over the past 20 years, we have found that some of the people who fear math most are Accountants, Financial Analysts, Controllers, Chief Financial Officers, Chief Operating Officers, and Chief Executive Officers. Many corporate officers "did not do well in high school algebra."

Take Daniel Sloan for instance: "Numerical dyslexia, reversing numbers instead of letters, has plagued me since I memorized my times tables in Mrs. Peiffer's fourth grade classroom. I can no more do math in my head than I can read the letters at the bottom of an eye chart without my glasses. I must wear glasses to see. I must use a computer to do math. Confronting math phobia was the most painful, anxiety-

provoking, personal, and humiliating career step I ever took. Overcoming my math phobia was a more strenuous challenge than all of my five years as a Vice President of Marketing, my stint as a Senior Vice President in a publicly traded, $500 million corporation, publishing five peer-reviewed statistical textbooks, and founding and running my own business for 14 years. It has been as rewarding as it has been difficult. One of the best things success has given me is the opportunity to help other business leaders like me take that frightening first step forward."

The best news for executives and workers alike is that cheap, reliable, and very user-friendly software makes vector analysis as easy to learn as sending an E-mail. Computerized, personal learning programs deliver privacy. Privacy is exceptionally important to adult learning. Pro-One's CD-ROM multi-media course Mathematics, Alge-Blaster, Math Blaster, and many other programs are great ways to re-learn the principles of addition, subtraction, multiplication, division and the order of operations. They are fun. They are available for adults who suffer from science and math phobias. Private tutors and educational consultants are other options that work well.

Larger than science and math phobias combined is the fear of losing one's job. Experience shows, money motivates. Six Sigma is a cultural business force that compels people to step up to a difficult task. It can and does persuade executives and line workers alike to face and overcome both these phobias.

Comparing Two Ways of Doing Things

"Hey Avona!" shouted Tom. "We think we have some evidence you are going to like. Just look at this stack of numbers. It looks like there might be a genuine difference between the two different helicopter designs."

"Way to go!" Avona's eyes opened wide. Still not having her statistical program, she entered the data into one of her spreadsheet templates and showed them the vector analysis in Table 3.

Table 3: Vector analysis for comparing two helicopter designs. The raw data are flight times minus the objective of 9 seconds. The profit signal consists of the average variation for each design. The average variation for white helicopters is -0.2 seconds of flight time. The average variation for pink helicopters is 0.2 seconds of flight time. The Profit Signal Vector has one degree of freedom because a single number, 0.2, determines it. When the numbers in this column are squared, the minus sign disappears. The squared lengths of all the vectors are connected by their part in the New Management Equation (NME).

"I can't believe it took me an hour to program this worksheet template so it will act like a data matrix," Avona complained. "What a waste of time. I sure hope the purchase order for my statistical software gets approved soon." Avona loved evidence, but patience was not her long suit.

"So Table 3 is where your 'cornerstone of evidence' comes from?" asked Mary.

"Right. We can make a model of your data and our new Analysis of Variance using some bamboo skewers and Sculpey Clay. I just happen to have a supply in my desk drawer. Just look at my models (Figure 4)."

Figure 4: The cornerstone of evidence represents any vector analysis. The labeled edges correspond to the vectors in Table 3. Differences in raw data change the dimensions. A Polydron regular tetrahedron model is next to a cornerstone of evidence.

Avona played with all sorts of modeling toys. Her office was filled with them. She told people they were symbolic. She would go on and on to anyone who would listen about some artist named Alexander Calder.

"I sure wish we had our data matrix software. It is silly for us to use a hand calculator, but let's use my $1 handheld calculator to help us cut the bamboo skewers to length. We will use inches as the units. The length of the raw data vector is the square root of 8.38, which equals 2.89 inches. The length of the data average vector is the square root of 8.00, which equals 2.83 inches. The length of the variation vector is the square root of 0.38, which equals 0.62 inches. The length of the profit signal vector is the square root of 0.32, which equals 0.57 inches. The length of the noise vector is the square root of 0.06, which equals 0.24 inches. The profit signal and noise vectors are the fine print in a vector analysis. The noise is so short it will be buried completely in the Sculpey Clay.

"We haven't talked about that last vector in the back of the tetrahedron. This is the vector of hypothetical predicted values. I didn't include it in my spreadsheet templates because it isn't important in the type of experiments we've been doing. It's tremendously important in response surface experiments. That's where we are optimizing over several continuous variables.

"Anyway, we get the prediction vector by adding together the profit signal and data average vectors. It gives the predicted average flight times for the two designs. In this case, the length of the prediction vector is the square root of 8.32, which equals 2.88 inches. It is always just a tad shorter than the raw data vector. It is a Law of the Universe." (Table 4)

Table 4: The vector of hypothetical predicted values is the sum of the profit signal and data average vectors. It has two degrees of freedom because it is determined by two numbers.

"Say, I just realized if you set one of those up on its end, it even looks like a radio tower sending out profit signals," Mary hypothesized.

"It sure is colorful," noted Dick, looking at their model. "Could we have hot pink Sculpey Clay points in space instead of green ones?"

"We sure can," said Avona. "I think Sculpey Clay is a Six Sigma product. I am going to need to bake mine in the break room toaster oven for a few minutes so the clay firms up and holds onto the vector skewers."

"Wow, look at that p-value in the table," said Tom, tearing his gaze away from Mary's profit signals radio tower. "There really is a difference between the two designs."

"The null hypothesis is that there is no difference between the designs," said Avona. "A p-value less than 0.01 gives evidence beyond a reasonable doubt against the null hypothesis.

"See, the spreadsheet actually has a formula called FDIST that calculates the p-value. It was named after Ronald Fisher. You just plug in the F ratio value, one degree of freedom for the profit signal and three degrees of freedom for the noise vector and voilà, we get the number 0.001. By subtracting the p-value 0.001 from the number 1, we get the number 0.999. This means we can be 99.9% confident that there is a difference between the pink and white designs. Even though the average difference is only 0.4 seconds, the vector analysis is sensitive enough to detect it. There is hardly any noise in this data at all. It is almost all profit signal! We are shredding that straw man like a mogul field at Mount Baker. Phenomenal work team!

"So, which design works best?"

"What is most profitable is best!" Tom, Dick and Mary sang out.

"Everyone can see pink helicopters are best," said Dick. "Plus, that's another $400,000 in profit per helicopter."

"It certainly looks that way," said Avona. "But before we release the pink design to production, let's do a confirmation experiment. And while we're at it, let's include the green design in the comparison. We don't have much data on that. See you guys later."

"Gee whiz Mary, why is Avona such a stick-in-the-mud?" said Tom after Avona had left. "And why does she keep saying 'we' when she really means us?"

"Just be grateful she didn't talk about evidence again."
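For readers who want to try Avona's FDIST arithmetic themselves, here is a minimal sketch. The function mirrors the spreadsheet call she describes; the F ratio below is a placeholder, since Table 3's numbers are not reproduced in this text. Plug in the ratio of the profit signal mean square to the noise mean square, with their degrees of freedom.

    from scipy import stats

    def fdist(F, signal_df, noise_df):
        # Equivalent of the spreadsheet's FDIST: upper tail of Fisher's F.
        return stats.f.sf(F, signal_df, noise_df)

    p = fdist(32.0, 1, 3)   # hypothetical F ratio and degrees of freedom
    print(f"p = {p:.3f}, confidence = {1 - p:.3f}")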

Comparing Three Ways of Doing Things

"Wow! I think we are onto something with these pink helicopters. We even checked them against green helicopters. They still came out best. Pink helicopters are best."

"And what is best is most profitable," said Avona. "Let's plug your numbers into my spreadsheet template. I want to show it to Rotcev Sisylana, our new CEO from Uzbekistan. He's gonna love this. Maybe he will get me two copies of my data matrix software. Shoot, they cost less than a thousand dollars. I wasted more than that last week dinking around with my spreadsheet templates."

Avona's analysis is presented in Table 5.

Table 5: Vector analysis for comparing three helicopter designs. The raw data are flight times minus the objective of 9 seconds. The profit signal consists of the average variation for each design. It has two degrees of freedom because it is determined by two numbers, 0.325 and -0.125 in this case. The third number, -0.200, is minus the sum of these two.

The null hypothesis is that all three designs will have the same average flight time. The p-value of 0.004 says there is evidence beyond a reasonable doubt that this is false. In other words, at least one of the designs is significantly different from another. Which one is best? From the profit signal, we can see that the pink design flies 0.325 seconds longer than the average flight time of 9.9 seconds. The white and green design flight times are 0.125 and 0.200 seconds shorter than average. Once again, pink is best.
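A comparison of three designs is a one-way analysis of variance, the computation behind Table 5. The flight times below are hypothetical stand-ins whose group averages sit 0.325 above and 0.125 and 0.200 below a 9.9-second grand mean, the shape described in the text; the book's own data are what produce the 0.004 p-value.

    from scipy import stats

    pink  = [9.9, 10.5, 10.1, 10.4]   # averages 10.225: +0.325 versus 9.9
    white = [9.5, 10.0,  9.8,  9.8]   # averages  9.775: -0.125
    green = [9.5,  9.9,  9.6,  9.8]   # averages  9.700: -0.200

    F, p = stats.f_oneway(pink, white, green)
    print(f"F = {F:.2f}, p = {p:.3f}")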

After reviewing the results in Table 5 Dick observed, "This table looks just like all the others except it's taller."

"Can I see those helicopters first hand?" asked Avona. "I would love to watch them fly." After carefully observing a few flights she noticed something the others had missed. "Have you noticed that the pink helicopters have longer blades than the white and green ones?"

"What?" blurted Tom and Mary. "We never noticed that before! Maybe it's actually the longer blades that cause the longer flight times."

"Of course," added Dick. "It's obvious that flight time should depend on blade length, not on color."

"Thank you Dick," Avona responded. Tom and Mary said nothing, but they each wondered why Dick had not mentioned this "obvious" thing earlier.

"Do we have to start over, Avona?" asked Tom.

"Not completely. When I got my PhD, I learned that the way to maximize the evidence in an experiment is to study several factors at the same time. But we are wasting time and money by analyzing only one factor at a time. We've spent $216 million and we still don't know anything about our other product features. Let's do a cube experiment!"

"Oh no," whispered Mary to Tom. "It's bad enough when she talks about evidence. Now it's cubes."

Comparing Eight Ways of Doing Things

But Avona was right. The cube experiment they decided to run had three factors: color, paper-clip ballast, and blade length. As shown in Table 6, each factor had two levels (settings or choices).

Table 6: The data matrix for the cube experiment run by Avona, Tom, Dick and Mary.

Avona had lost patience with her senior management peers. She had finally purchased her own copy of the statistical software and installed it on her laptop. Anyway, she first showed everyone the vector analysis in her spreadsheet template (Table 7).

"I notice this table is just the same as the others, except it's wider," observed Dick.

"Thank you, Richard. Overall, it looks like we have two statistically significant profit signals. The p-values for paper clip ballast and blade length are 0.047 and 0.028, respectively. By looking at the profit signal vector for paper clip (Y), we can see that not adding the weight to the helicopter adds 0.12 seconds to the overall average flight time. Also, we can see that adding weight subtracts 0.12 seconds from the overall average flight time. Overall, this means that not adding the weight to the helicopter increases the average flight time by 0.24 seconds compared to adding the weight. That's $240,000 additional profit per helicopter sold.

"By looking at the profit signal vector for blade length (Z), we can see that using the long blade adds 0.20 seconds to the overall average flight time. Also, we can see that using the short blade subtracts 0.20 seconds from the overall average flight time. This means that using the long blade instead of the short blade increases the average flight time by 0.40 seconds. That's $400,000 additional profit per helicopter sold.

Table 7: Vector analysis for the cube experiment run by Avona, Tom, Dick and Mary. The raw data are flight times minus the objective of 9 seconds.

"The combined effect of these two changes is an increase of 0.64 seconds. This means a total of $640,000 additional profit per helicopter sold. We'll make millions."

Tom asked, "I know that X, Y and Z are code names for the three factors. But what do XY, XZ and YZ mean?"

Avona said, "They are code names for the interactive effects among the factors. An interactive effect exists when the effect of one factor depends on the level (choice or setting) of another factor. In this case there were no significant interactions. Usually there are."

Mary asked, "Is that why it was OK to just add together the effects of paper clip and blade length?"

"Exactly!"
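The XY, XZ and YZ columns are nothing mysterious: each is the elementwise product of two factor columns, and every effect, main or interactive, is estimated the same way. In the sketch below the eight flight times are placeholders, not the team's Table 6 data.

    import numpy as np
    from itertools import product

    design = np.array(list(product([-1, 1], repeat=3)))  # the 8 cube corners
    X, Y, Z = design.T                    # color, paper clip, blade length
    times = np.array([8.8, 9.0, 9.1, 9.2, 9.2, 9.3, 9.5, 9.7])  # placeholders

    columns = {"X": X, "Y": Y, "Z": Z, "XY": X * Y, "XZ": X * Z, "YZ": Y * Z}
    for name, column in columns.items():
        effect = column @ times / 4       # average at +1 minus average at -1
        print(f"{name:3s} effect: {effect:+.3f} seconds")

When the interaction effects come out near zero, as Avona observed, the main effects can simply be added together.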

Next, Avona opened her statistical software and clicked her mouse a few times. Up came the Pareto chart in Figure 5.

Figure 5: Modern statistical software presents analysis results as pictures. Everyone can see just by looking which factors make the biggest difference.

Everyone was taken aback to see Avona use a bar chart. "Have you become a bar chart bamboozler?"

"Not really," said Avona. "It's just that modern software manufacturers are smarter than they used to be. They found out customers wanted a quick visual analysis of which factors have the largest effects."

Comparing 256 Ways of Doing Things

"Rotcev wants us to test eight different variables," complained Mary. "That is 2⁸, or 256, combinations. That would cost us $9 million times 256, or $2.3 billion!"

"But with our data matrix software we can screen all eight factors with only 16 helicopters," said Avona. "That would cut our R&D costs by 94 percent."

"Good thinking!" cried Mary. "Rock on!" shouted Tom.

The team built 16 helicopters with different configurations using two different levels of Rotcev's 8 factors: paper type, paper clip, aerodynamic folding, wing tape, body tape, body width, body length, and blade length.
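How do eight factors fit into sixteen runs? The extra factor columns are generated from interactions of a four-factor base design. The sketch below uses one standard resolution IV 2⁸⁻⁴ set of generators, not necessarily the plan Avona's software chose.

    import numpy as np
    from itertools import product

    base = np.array(list(product([-1, 1], repeat=4)))  # 16 runs, factors A-D
    A, B, C, D = base.T
    E, F, G, H = B*C*D, A*C*D, A*B*D, A*B*C            # generated columns

    design = np.column_stack([A, B, C, D, E, F, G, H])
    print(design.shape)   # (16, 8): sixteen helicopters, eight factors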

Figure 6 shows the data matrix for the experiment, including the flight times that were obtained.

Figure 6: Statistical software automatically determines the hyperspace geometry for testing eight different variables simultaneously using only 16 experiments.

The software calculated the vector analysis in less time than it took to click the mouse. The Pareto chart ranking the eight factors by strength of signal is shown in Figure 7.

Figure 7: Statistical software automatically rank orders each factor according to the size of its Profit Signal strength.

"If I read this right, it looks like we could be over-engineering our product," observed Dick. "Very few of the other factors, including the expensive paper, make a difference."

"Very astute thinking Dick," complimented Mary. "I think you just figured out a few good ways for us to make more money."

"Rotcev needs to meet this team and hear about these results soon," said Avona.

Chapter Homework

Think of these two elements, profit signals and noise, by using your cell phone as an analogy. Strong signals are easy to understand. Noise or static is impossible to decipher. The strong signals in our exercise data matrix came from the two factors that influenced the outcome. Evidence, when used to make business decisions, leads to consistently reliable predictions and Six Sigma style profits.

Variation is everywhere. Variation surrounds every measurement and measurement system. Chance or random variation is a phenomenon of nature. For example, weigh yourself on a bathroom scale and record this measurement. Wait a few moments and weigh yourself again. Weigh yourself every hour and keep a running record throughout the day. You will discover that your weight may vary by as much as six to 10 pounds per day. Why? Everything varies, including your weight and the system used to measure it.

Statistical evidence is a ratio. Evidence is the length of the profit vector divided by the length of the average noise vector. Noise has its own vector. The right triangle vector illustrations in this book show how all measurements, all data sets, be they complex or simple as pie, can be decomposed into these two parts. A data matrix and the rules of a vector analysis sort profit signals from noise.

The deceptive simplicity of 2³ cube arrays makes visual, correct, intuitive, and statistically significant inferences possible. In this way, facts can be seen by anyone at a glance. There is no need to work equations. So long as you stick with the inherent discipline of a data matrix, you and your colleagues will simply be able to see the answers to problems. It makes analysis fast. Deliberate analytic speed saves enormous amounts of time.

Closing Arguments

Orville Wright, one-half of the team that used The New Management Equation to create the airplane, comments on the use of data: "I have myself sometimes found it difficult to let the lines run where they will, instead of running them where I think they ought to go. My conclusion is that it is safest to follow the observations exactly."⁶

Endnotes

1 Sloan, M. Daniel. Using Designed Experiments to Shrink Health Care Costs. Milwaukee: ASQ Quality Press, 1997. Page 39.
2 Dilson, Jesse. The Abacus: The World's First Computing System: Where It Comes From, How It Works, and How to Perform Mathematical Feats Great and Small. New York: St. Martin's Press, 1968.
3 A New Aristotle Reader. Edited by J. L. Ackrill. Princeton: Princeton University Press, 1987. Page 32.
4 Sagan, Carl. The Demon Haunted World: Science as a Candle in the Dark. New York: Ballantine Books, 1996. Page 328.
5 Sagan, Carl. The Demon Haunted World: Science as a Candle in the Dark. Page 140.
6 Jakab, Peter L. Visions of a Flying Machine: The Wright Brothers and the Process of Invention. Washington: Smithsonian Institution Press, 1990.

Boyles. 2003 . Daniel Sloan and Russell A.176 Using Profit Signals © M. All Rights Reserved.

Chapter 6

Predicting Profits

Making accurate predictions is an important, difficult task. By now, you may not be surprised to learn vector analysis is the international standard for making predictions as well as for making comparisons. What a wonder. The vector analysis methods for solving prediction problems are known as regression modeling and analysis. This is good news for Corrugated Copters and your company too. Before returning to the exploits of our Six Sigma breakthrough project heroes Mary, Tom, Dick, Avona, and Rotcev, an orientation to basic correlation and regression concepts is in order.

"Correlation assesses the tendency of one measure to vary in concert with another," wrote Stephen Jay Gould in The Mismeasure of Man. "The invalid assumption that correlation implies causation is probably among the two or three most serious and common errors of human reasoning."¹ Experienced managers candidly acknowledge that cost-accounting variance analysis is based on this faulty premise.

Month after month, trillions of dollars in corporate and governmental resources are squandered trying to explain prediction errors that are inevitable. No wonder old school spreadsheet forecasts bear so little relationship to actual business sales, revenue, earnings, inventory, and other performance metrics. Never is it recognized that the one-dimensional "prediction" methods mechanized in spreadsheets and institutionalized by business

2003 . but among these fingerprints are the most cost-effective. Galton wrote. © M. hand. endings and statistically significant patterns. iris. The voice. All Rights Reserved. while he is occupied in some special inquiry. Daniel Sloan and Russell A. this Generalization. reality TV. we did so readers could see he was speaking about a true Law of the Universe. bifurcations.3 This breakthrough soon found its way into courtrooms of judgment and justice around the world. Think Disney. This technology. They are the fingerprints every process leaves behind. The graphic statistical results of vector analysis. the same year the grandfather mentioned in our Premise—the man who used paper bags and arithmetic to cipher out his farm’s business transactions because he didn’t trust the new fangled way of doing things called multiplication—the genius Galton was pioneering the use of fingerprints as forensic evidence. Think prime time.”2 Though Galton did not capitalize the word Generalization in this instance. Most all forensic evidence presented by the entertainment industry in whodunits. Think Spielberg and Lucas. and face can be used in addition to fingerprints.” 5 Think movies. The Generalization of which I am about to speak arose in this way.4 Quoting from Internet sales literature. applied to a data matrix.178 Predicting Profits school curriculums were made obsolete in 1890 by Charles Darwin’s half-cousin Francis Galton. Boyles. are fingerprints. Fingerprint Evidence It is an entertaining and obscure footnote in the history of evidence-based decisions that by 1893. and that his results hold good in previously-unsuspected directions. In his 1890 essay Kinship and Correlation. “Few intellectual pleasures are more keen than those enjoyed by a person who. Each fingerprint data set exhibits unique swirls. “Biometrics is a technology that analyzes human characteristics for security purposes. is the grandchild of Francis Galton’s imagination. suddenly perceives that it admits of a wide Generalization.

Think movies. Think Disney. Think Spielberg and Lucas. Think prime time, reality TV, science fiction. Most all forensic evidence presented by the entertainment industry in whodunits, murder mysteries, and gumshoe adventures is based on the New Management Equation, c² = a² + b². Serious biometric predictions—be they concerned with acute lymphoblastic leukemia, therapeutic vaccine studies, or criminal investigations—all use the Pythagorean Theorem.6 This evidence adds up. A vector analysis applied to a data matrix is a correct analysis.

Profits are too important to be left to the Chance coincidence that a paranormal guess will sometimes be right. Forecasts conjured without the cornerstone of evidence and the New Management Equation belong in a dust bin with auras, Rune readings, Tarot cards, divining rods, palmistry, past lives, soothsaying, and good old-fashioned wishful thinking.

Given the stakes of international commerce, the weight of evidence we present in this chapter is appropriate. This chapter is weighty. If the reading gets a bit heavy for you, peek at the illustrations. Look for the right triangles. Those pictures are our wink at you. You know the secret handshake and inside joke. Remember, you will never have to do any of these calculations. Ever. Data matrix software takes your data and lays it all out for you.

Three Wishes

Cost-accounting variance analysis has been around almost as long as Aladdin’s Lamp. G. Charter Harrison’s standardized cost model was a step forward in 1918. In 2003 it is too simplistic to satisfy international standards for quantitative analysis. Even worse, it is the mother of white-collar waste and rework. Surely, it must have some merit. If we rub it, and rub it, and rub it again with our erasers, the Genie will appear. Will we not be granted our three wishes? Unfortunately, no. We will not. For example, consider the traditional break-even analysis pictured in Figure 1. Here are the three wishes:

1. I wish my revenue were exactly a straight-line function of volume.

2. I wish my expenses were exactly a straight-line function of volume.

3. I wish the relationship between these lines never changed.

Figure 1 The traditional break-even analysis is a good example of wishful thinking in the white-collar work place. [Chart: Dollars versus Product or Service Volume, with straight lines labeled (Averaged) Income and (Averaged) Expenses.]

Granting these three wishes would be equivalent to suspending the physical laws of our universe. Even a wide-screen Disney genie would decline this opportunity. Noise, Chance or random variation, attends every measurement. The mythological Greek Sisyphus had a better chance of rolling his rock to the top of his hill than a manager has of making his monthly results fall exactly on a hypothetical straight line.

Table 1 compares and contrasts wishful thinking with reality. As numbers in the first column increase, numbers in the second column increase by an exactly proportional amount. This perfect linear relationship produces perfect predictions. These are plotted in Figure 2. They get rave reviews in management meetings. This sort of line is a sure sign that shenanigans, rather than standards of evidence, are in use.

Table 1 Wishful thinking versus reality. The “wish profits” are hypothetical straight-line predictions. The “real profits” are actual results.

Figure 3 shows the actual performance numbers on which the linear relationship was based. There is, you guessed it, a huge amount of variation. A single-number prediction is useless without a statement of prediction error based on the degree of variation in the process being predicted. We need to see the profit signal and noise vectors.

Figure 2 Wishful thinking results falling exactly on the straight-line prediction earn rave reviews in management meetings.

Figure 3 The straight-line predictions were based on real profit data containing a huge amount of variation. A single-number prediction is useless without a statement of prediction error based on the degree of variation in the process being predicted.

A persistent leadership “homily” suggests that idealized targets “inspire” superior performance. We have observed the opposite. Arbitrary goals are products of wishful thinking. They are exercises in futility. They demoralize and debilitate the people assigned to achieve them. They waste time and money that might otherwise find its way to the bottom line. Even the best of intentions cannot redeem a patently false premise.

Prediction Practice

“Say Avona,” said Mary, “I have an idea. You would probably call it a hypothesis.”

“You know physics, so your hunch has my attention,” said Avona, “but can you be a little more specific?”

“Well, first I noticed there is quite a bit of variation in our flight times. Then I noticed there is quite a bit of variation around our target blade length. We already know blade length is a key control variable. Tom told me it would be difficult and expensive to put tighter controls around our tolerance specifications. Finally, I noticed there is also quite a bit of variation in our drop height. My hypothesis is that drop height could be used as another control variable. I have a hunch we might even be able to predict the flight time from the drop height.”

“But why would you want to predict flight time from drop height?”

“You know perfectly well why! If we can accurately predict performance we can anticipate the future. It would be like knowing what the stock market is going to do tomorrow. We could use that knowledge to make more profits. Then, if we could predict flight time from drop height, we could compensate for a variation in blade length by making an opposite, off-setting change in the drop height. This would be relatively easy to do. This way, we could hit our advertised flight times with much less variation.”

“That’s a great idea,” said Avona with a knowing smile. “Make it so.”

“Wait a minute,” said Mary. “Can you give me a preview of how we’re going to do this?”

“With pleasure,” said Avona. “Let’s say your data looks like this (Table 2).”

“This isn’t the same as the other things you showed us. We aren’t trying to find the best way of doing something, where what is most profitable is best. We’re trying to determine a relationship.”

“Yes. In problems that involve a relationship between two variables, we have to know which is the independent variable and which is the dependent variable.

Table 2 The data matrix array for three flight times paired with three different drop heights.

“The independent variable is the one we will use to make the prediction, which for us is the drop height. The letter X is used to symbolize the independent variable. The dependent variable is the one we want to predict, which for us is the flight time. The letter Y is used to symbolize the dependent variable.”

“OK, but what’s this ‘coded drop height’ about?” asked Mary.

“The values -1, 0 and 1 are codes for low, medium and high drop heights. At the minus setting I was sitting in my chair. At zero I was standing. At 1 I got up on my desktop. I know it would look better if we put in the actual drop heights, but it’s easier to explain if we use the codes. It’s also easier to draw the picture and set up the spreadsheet template. Of course, we don’t have to bother with the coding when we use our statistical software.

“Anyway, here is the vector analysis from my spreadsheet template for this practice problem (Table 3). It takes care of everything. We’re modeling flight time as a linear function of drop height. If we had more data, we might try something more elaborate.

“The only difference between this vector analysis and the ones we did before is the way the profit signal vector gets calculated. The best-fitting straight line is shown in Figure 4. Look closely and you can roughly see that the slope of the predicted line in this case is 1.25. Notice that 8.42 plus 1.25 equals 9.67. Also notice that 9.67 plus 1.25 equals 10.92. For our actual flight times 8.5, 9.5 and 11.0, the straight-line predictions are 8.42, 9.67 and 10.92.

Table 3 Vector analysis for fitting Y (flight time) as a linear function of X (drop height).

Figure 4 Best-fitting straight line for Y as a function of coded X.

“This means the forecast goes up 1.25 Y units for every coded X unit. We get the profit signal vector by multiplying this slope times the coded X data vector.”
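For readers who want to check this away from the spreadsheet template, here is a minimal sketch in plain Python. It is our illustration, not the statistical software used in the story, and it assumes the three practice flight times are 8.5, 9.5 and 11.0 seconds, the values consistent with the average of 9.67 and slope of 1.25 quoted in the dialogue:

    x = [-1, 0, 1]          # coded drop heights
    y = [8.5, 9.5, 11.0]    # assumed practice flight times, in seconds

    n = len(y)
    y_bar = sum(y) / n      # 9.67, the data average
    # With coded X values that average to zero, the least-squares slope is
    # the sum of the X*Y products divided by the sum of the squared Xs.
    slope = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

    average_vector = [y_bar] * n              # the Y data average vector
    profit_signal = [slope * xi for xi in x]  # slope times the coded X data vector
    noise = [yi - a - s for yi, a, s in zip(y, average_vector, profit_signal)]

    print(round(slope, 2))                 # 1.25
    print(profit_signal)                   # [-1.25, 0.0, 1.25]
    print([round(v, 2) for v in noise])    # [0.08, -0.17, 0.08]

The three printed vectors add back up to the raw data, which is the whole point of the decomposition.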

“Can you tell me whether or not the slope is statistically significant?” asked Avona.

Mary thought for a moment, then said, “It doesn’t achieve the ‘clear and convincing’ standard of evidence, because the p-value is greater than 0.05. It does achieve the ‘preponderance of evidence’ standard because the p-value is smaller than 0.15.”

“Well done,” Avona exclaimed. “Exactly. Here is the drawing for this vector analysis (Figure 5).

Figure 5 Picture of the vector analysis for fitting Y as a linear function of X. [The drawing shows the raw data values 8.5, 9.5 and 11.0, the coded X data vector, the profit signal and noise vectors, and the regression line in a shaded plane.]

“Notice that the profit signal vector is parallel to the coded X data vector at the lower left. This is always true because the profit signal vector is always equal to the coded X data vector multiplied by the slope of the best-fitting line, 1.25 in this case. If the slope is larger, the relationship between X and Y is stronger, and the profit signal vector is longer. If the slope is smaller, the relationship between X and Y is weaker, and the profit signal vector is shorter.”

Now Mary had a question. “OK, but how do I use all this to make a prediction?”

Avona responded, “If the coded drop height is 0.5, the predicted Y value would be somewhere between 10.0 and 10.5. Graphically, what we do is start at a coded X value in Figure 4, go up to the fitted line, then over to the raw Y data axis. The number on that axis is the prediction.

“We can be more exact if we are willing to deal with the actual equation of the line:

Predicted flight time = 9.67 + 1.25 × (Coded drop height)

“9.67 is the average flight time in our practice data set. 1.25 is the slope. For example, if we had an X value of 0.5, then we get:

Predicted flight time = 9.67 + 1.25 × (0.5) = 9.67 + 0.63 = 10.30

“Applying this equation to the coded X values -1, 0 and 1 is the same as adding together the Y data average vector and the profit signal vector. The result of this is called the predicted Y vector (Table 4). If you plotted the predicted Y values versus the coded X values, they would fall exactly on the straight line in Figure 4.”

“Is the predicted Y vector visible in Figure 5?” asked Mary.

“Yes,” answered Avona. “It’s the vector in the shaded plane labeled ‘Regression Line’. It goes from the point (0, 0, 0) up to the point where the noise and profit signal vectors intersect.”

Mary had one final question. “If we make a prediction, don’t we have to state the prediction error based on the variation in our data?”

“Right again,” said Avona. “The statistical software will automatically show you the prediction error. But let’s save that for when you get your real data.”
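Avona’s equation is easy to sketch as a one-line function. This is our illustration of the arithmetic, not output from the statistical software:

    def predicted_flight_time(coded_drop_height):
        # average flight time plus slope times the coded drop height
        return 9.67 + 1.25 * coded_drop_height

    print(predicted_flight_time(-1))    # 8.42
    print(predicted_flight_time(0))     # 9.67
    print(predicted_flight_time(1))     # 10.92
    print(predicted_flight_time(0.5))   # 10.295, about 10.30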

Table 4 Predicted Y vector from fitting Y as a linear function of X.

Predicting Real Flight Times

“OK, Avona, I have my data now,” announced Mary as she barged into Avona’s office. “We did five tests at each of three drop heights. I used coded values -1, 0 and 1 for the low, medium and high drop heights. I entered my data into the spreadsheet template you gave me (Table 5).”

“I see you bought your own copy of the statistical software, too,” observed Avona.

“Yeah. I get four times as much work done in a quarter of the time it would take with a spreadsheet. I’m giving Corrugated Copters much better information and spending more time with my family. This is astonishing.

“Anyway, about my study. The overall standard deviation of the flight times is 0.288 seconds. The noise standard deviation is 0.056 seconds. The p-value is 0.000, so there is a real relationship here, beyond a reasonable doubt. We can eliminate 81% of our flight time variation by controlling the drop height.”

“Good job,” said Avona. “How in the world did you come up with 81 percent?”

“That was hard. Well, I took a lucky guess. See, the standard deviation of the variation vector is 0.288. The standard deviation of the noise vector is 0.056. 0.056 is 19.4 percent of 0.288. It all added up, so I figured that is why you put the last line in your spreadsheet template.”

“Impressive,” admitted Avona. “Great job, Mary.”

“Thank you. Now check me on this next thing. I think the predictive equation is this:

Predicted flight time = 1.60 + 0.34 × (Coded drop height)

“I used the last line in Table 5 to puzzle it out. Is that right?”

Table 5 Vector analysis for fitting flight time (Y) as a linear function of coded drop height (X). The raw data are Mary’s actual flight times minus the objective of 9 seconds.
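Mary’s 81 percent is one line of arithmetic. A sketch, using the two standard deviations she quotes:

    overall_sd = 0.288   # standard deviation of the variation vector, seconds
    noise_sd = 0.056     # standard deviation of the noise vector, seconds

    remaining = noise_sd / overall_sd    # about 0.194, the 19.4 percent left over
    eliminated = 1 - remaining           # about 0.806
    print(f"{eliminated:.0%}")           # 81%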

“Yup. Your equation is right on the money. And I mean ‘money’ in the literal sense, given the reduction in variation you’ve demonstrated. 1.60 is the overall average of your Y data. 0.34 is the slope of the best-fitting line using coded X data. You’re batting 1000 today, Mary,” answered Avona. “Yesterday, I said I would show you how to use the statistical software to get a picture with predictions and prediction limits. Let’s do that now.”

Avona clicked her mouse two or three times to produce the graph in Figure 6. “This shows the data, the best-fitting line, and the 95% prediction limits. When these limits are narrower, your predictions are more accurate. When they are wider, your predictions are less accurate.”

Figure 6 Picture of the best-fitting straight line for predicting flight time (Y) as a linear function of coded drop height (X). The 95% prediction limits are approximately two noise standard deviations above and below the line.

“Is there an easy way to calculate the limits?” asked Mary.

“Yes,” answered Avona. “The upper limit is approximately two noise standard deviations above the predicted value, and the lower limit is approximately two noise standard deviations below the predicted value. Your noise standard deviation is 0.056 seconds. Two times this is 0.112. So, predictions from your equation will be accurate almost to plus or minus one tenth of a second, with 95% confidence.”
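The two-noise-standard-deviation rule is simple enough to sketch as well. The function below is our hedged approximation of what the software draws in Figure 6, using Mary’s fitted equation. Exact 95% prediction limits also account for the uncertainty in the fitted line itself, so the software’s limits will be slightly wider than these:

    noise_sd = 0.056   # seconds, from Mary's vector analysis

    def prediction_with_limits(coded_drop_height):
        # Mary's equation models flight time minus the 9-second objective
        predicted = 1.60 + 0.34 * coded_drop_height
        return predicted - 2 * noise_sd, predicted, predicted + 2 * noise_sd

    print(prediction_with_limits(0))   # (1.488, 1.6, 1.712), roughly plus or minus 0.11 seconds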

Closing Arguments

Peter L. Bernstein, noted economist, economic advisor to nations and multinational companies, and author of the Business Week, New York Times, and USA Today best seller Against the Gods: The Remarkable Story of Risk, comments on the value of prediction and regression. “Forecasting—long denigrated as a waste of time at best and at worst a sin—became an absolute necessity in the course of the seventeenth century for adventuresome entrepreneurs who were willing to take the risk of shaping the future to their own designs.” “Commonplace as it seems today, the development of business forecasting in the late seventeenth century was a major innovation.”7

“If nature sometimes fails to regress to the mean, human activities, unlike sweet peas, will surely experience discontinuities, and no risk management system will work very well.” Galton recognized that possibility and warned, “An Average is but a solitary fact, whereas, if a single other fact be added to it, an entire Normal Scheme, which nearly corresponds to the observed one, starts potentially into existence.”8

Endnotes

1 Gould, Stephen Jay. The Mismeasure of Man. New York: W.W. Norton & Company, 1996. Page 272.

2 Galton’s complete works are available from a variety of library and Internet sources. This quote comes from http://www.mugu.com/galton/essays/1890-1899/galton-1890-nareview-kinship-and-correlation.html. The origin of this web page is the comprehensive http://www.mugu.com/galton/start.html

3 http://www.mugu.com/galton/

4 http://www.fme.wvu.edu/~bknc/BiometricResearchAgenda.pdf

5 http://www.fujitsu.com/products/biometric/pdf/Find_FPS.pdf

6 http://www.fujitsu.com/products/biometric/pdf/Find_FPS.pdf

7 Bernstein, Peter L. Against the Gods: The Remarkable Story of Risk. New York: John Wiley & Sons, 1996. Page 95.

8 Bernstein, Peter L. Against the Gods: The Remarkable Story of Risk. New York: John Wiley & Sons, 1996. Page 182.

Chapter 7
Sustaining Results

Stewardship entails honorable conduct in the management of other people’s property. But it is broader than that. “It also includes respect for the people’s moral responsibilities,”1 observed William G. Scott and David K. Hart in Organizational Values in America. The privileges of executive corporate leadership are paired with responsibilities. More than profit is at stake when a manager begins the workday. Jobs and the welfare of one’s community are on the line with every significant decision. These responsibilities require a constant demonstration of trustworthiness in economic and personal conduct. Every corporate director and senior level executive lives in the world of uncertainty.

Six Sigma products and services are powered by evidence-based decisions. Evidence-based decisions reduce uncertainty. They quantify financial and human risk in ways that can be validated and replicated. They produce near perfect, Six Sigma performance. Near perfect, Six Sigma performance engenders trust. Customers confidently bet their lives, and the lives of their loved ones, on Six Sigma performance.

Poor quality management decisions can and do injure our world. The Copper-7 Intrauterine Device (IUD) and Thalidomide are quintessential medical management mistakes.2 Three Mile Island and Chernobyl are monumental governmental management blunders.

The after effects of the Exxon Valdez wreck in Alaska, the Union Carbide plant explosion at Bhopal, India, Hooker Chemical’s Love Canal disaster at Niagara Falls, and the sinking of Brazil’s PetroBras platform and its resulting million-gallon oil spill all bear witness to the wide ranging effects of poor quality decisions.

Seventeen years ago, Nobel Laureate Richard Feynman used an “Avona” model to demonstrate in no uncertain terms that NASA scientists did not understand the concept of correlation.3 It is now widely acknowledged that Challenger might still be flying if NASA managers had applied vector analysis to a data matrix in 1986.4 These powerful tools were widely available at the time. Unfortunately, in 2003 it is clear that NASA managers are still basing important decisions on spreadsheet calculations.

Evidence-based decisions provide a safety net that can help protect us from poor quality judgments. Six Sigma has shrunk, and is shrinking, our globe. Managers around the world now understand, in dollars and cents, that how they treat a worker sewing a soccer ball in Pakistan affects not only the outcome of the World Cup, but also the political stability of the nations where they do business. As one executive told us recently, “Our home office is Earth. The staff meetings I go to look like the United Nations. We emphasize diplomacy both inside and outside the company. Our international relationships build the teamwork we need to compete.” This new level of thoughtfulness is a good thing.

These are things to think about as we rejoin Corrugated Copters. Tom, Dick, Mary and Avona have stabilized their flight times. They know which way is best. Their challenge now is to hold and gain market share. They need to sustain their best practices and profits.

Evaluating Practices and Profits

“Avona, our customers are insisting that we manufacture helicopters with virtually no variation in flight time. They told me we have to have a Cpk of 1.5 or better, or we are out as their main supplier,” said Mary. “What on earth is a Cpk? Is it an abbreviation for something?”

“No, it’s just another goofy Six Sigma symbol,” said Dick, who just wandered into the break room with a fresh steamed latte.

“What?”

“Just kidding,” chided Avona. “Not at all. Cpk is a numerical index that quantifies how capable a process is of producing virtually perfect quality output. It is a function of the average and the standard deviation. It puts an additional spin on these by combining them with the Upper and Lower Specification Limits. A Cpk of 1.5 is an accepted international standard, at least for components and sub-processes. A Cpk of 1.5 implies no more than 3 or 4 defective products or services per million delivered.”

“That level of quality is impossible.”

“You’re kidding.”

“No. I’m not. Seriously. In fact, companies that cannot or will not produce that level of quality are getting edged out or even kicked out of the marketplace. This is old news. Six Sigma is not just a passing fad. It is an extremely disciplined way of competing for market share.”

“What? Cpk isn’t just the same old New Management Equation?”

“Like most other things in Six Sigma, Cpk is based on the New Management Equation. Let me show you how to calculate a Cpk value from your data.”

Avona pulled out a piece of paper and wrote the following expressions.

“Cpk is defined to be the smaller of two other numbers called Cpl and Cpu. Cpl is the number of process standard deviations between the process average and the Lower Specification Limit, divided by 3. Get it? Process? Lower? Capability? Cpl? Don’t you just hate the way acronyms aren’t even arranged in order? Cpu is the number of standard deviations between the average and the Upper Specification Limit, divided by 3.”

“Why are they divided by 3?” asked Mary.

“It’s an arbitrary convention. Everyone uses it. We’re stuck with it. Anyway, we want Cpk to be as large as possible. This means we want both Cpl and Cpu to be as large as possible.”

“Maybe you could draw us a picture,” suggested Dick.

“I’ll use a bell-shaped curve to represent process variation. Let’s say we have a Lower Specification Limit (LSL) of 7.2 and an Upper Specification Limit (USL) of 10.8. Here’s a process with an average of 8.4 and a standard deviation of 0.6 (Figure 1). The average is 2 standard deviations above the Lower Specification Limit (LSL) and 4 standard deviations below the Upper Specification Limit (USL).”

“You could even put a right triangle in it,” suggested Dick.

“Good idea,” said Avona. “The right triangle actually gives us a convenient place to put the standard deviation. Therefore,

Cpl = 2/3 = 0.67 and Cpu = 4/3 = 1.33

Figure 1 A process with Cpk = 0.67. The average is 2 standard deviations above LSL and 4 standard deviations below USL. Roughly 2.5% of outcomes from this process will fall below the Lower Specification Limit.

“Cpk is the smaller of these two numbers, 0.67. This implies that roughly 2.5% of outcomes from this process will fall below LSL.”

“That’s not good,” observed Mary.

Avona drew another picture. “Here’s a process with an average of 9.0 and a standard deviation of 0.6 (Figure 2). The average is 3 standard deviations above LSL and 3 standard deviations below USL. Therefore,

Cpl = 3/3 = 1.00 and Cpu = 3/3 = 1.00

“Cpk is the smaller of these two numbers, but in this case, since the process is perfectly centered, these numbers are the same. So Cpk equals 1.00. This implies that roughly 0.3% of units produced from this process will fall below LSL or above USL.”

Figure 2 A process with Cpk = 1.00. The average is 3 standard deviations above LSL and 3 standard deviations below USL. Roughly 0.3% of outcomes from this process will fall below LSL or above USL.
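Avona’s examples can be verified with a few lines of code. A minimal sketch of the Cpl, Cpu and Cpk arithmetic (output values shown rounded):

    def process_capability(mean, sd, lsl, usl):
        cpl = (mean - lsl) / (3 * sd)
        cpu = (usl - mean) / (3 * sd)
        return cpl, cpu, min(cpl, cpu)   # Cpk is the smaller of Cpl and Cpu

    print(process_capability(8.4, 0.6, 7.2, 10.8))   # (0.67, 1.33, 0.67), Figure 1
    print(process_capability(9.0, 0.6, 7.2, 10.8))   # (1.00, 1.00, 1.00), Figure 2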

“That’s better,” observed Mary.

“But still not good enough for Six Sigma,” said Avona. She drew a third picture. “Here’s a process with Cpk = 1.33 (Figure 3). The average is 9.6 and the standard deviation is 0.3. The average is 8 standard deviations above LSL and 4 standard deviations below USL. Therefore,

Cpl = 8/3 = 2.67 and Cpu = 4/3 = 1.33

“This implies that roughly 32 outcomes per million will fall above USL.”

Figure 3 A process with Cpk = 1.33. The average is 8 standard deviations above LSL and 4 standard deviations below USL. Roughly 32 outcomes per million will fall above USL.

Avona drew a fourth picture. “Here’s a process with Cpk = 1.67 (Figure 4). The average is 9.3 and the standard deviation is 0.3. The average is 7 standard deviations above LSL and 5 standard deviations below USL. Therefore,

Cpl = 7/3 = 2.33 and Cpu = 5/3 = 1.67

“This implies that roughly 287 outcomes per billion will fall above USL.”

Figure 4 A process with Cpk = 1.67. The average is 7 standard deviations above LSL and 5 standard deviations below USL. Roughly 287 outcomes per billion will fall above USL.

“OK, now for your quiz,” said Avona. “What would Cpk be if we moved the average to 9.0, right in the center of the specification range?”

Mary answered first. “Then the average would be 6 standard deviations above LSL and six standard deviations below USL. Cpl and Cpu would both be equal to 6/3, which is 2, so Cpk would be 2.”

“Right on,” said Avona. “And FYI, a process with that level of capability would produce no more than 2 or so defective outcomes per billion.”

“Now that’s something to aspire to.”
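The fallout figures quoted in this exchange all come from the normal bell-shaped curve. A sketch of the check, using only the Python standard library; the z value is the number of standard deviations between the average and the nearest specification limit, which is 3 times Cpk:

    from math import erfc, sqrt

    def upper_tail(z):
        # P(Z > z) for a standard normal distribution
        return 0.5 * erfc(z / sqrt(2))

    print(upper_tail(4) * 1e6)        # about 32 per million (Cpu = 1.33)
    print(upper_tail(5) * 1e9)        # about 287 per billion (Cpu = 1.67)
    print(2 * upper_tail(6) * 1e9)    # about 2 per billion, both tails (centered, Cpk = 2)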

Process Improvement Simulation

“Just for grins, let’s roll some dice,” suggested Avona. “Viva Las Vegas!”

Mary said, “OK, I will record them as you roll.”

After a few rolls, Dick said, “What a surprise! I’m getting numbers between 2 and 12. Hmm, it looks like more of them are in the middle of the range than at the extremes.”

She entered the numbers into her calculator. “The average is about 7 and the standard deviation is a little over 2.”

“Something must be wrong with my dice,” Tom chimed in. “All I get are elevens. Perfect elevens every time. What’s up with that? Oh, wait a minute. One of my dice has a five on every side and the other one has a six on every side. Avona, where did you get these?”

“Wizards of the Coast,” answered Avona. “They have lots of games based on probabilities and three dimensional reasoning.”

“The average flight time for our current design is 11 seconds. I bet that makes Avona happy,” observed Dick.

“It does,” said Avona. “By fixing the dice so you always get the same outcome, you are playing with a Six Sigma process. The only way you can get a different answer is to write down the wrong number. And it’s a nice segue into what I wanted to show you.”

“Don’t you always?”

“Let’s work out how many ways there are to get each possible outcome with two regular dice,” said Dick (Figure 5).

Figure 5 Seven is the most probable outcome when rolling two dice. Six different combinations will produce a seven, while only one combination will produce either a 2 or a 12.

“I know a better way to present the outcomes (Figure 6),” said Avona. “In fact, this way it looks sort of like a bell-shaped curve. We can use the throwing of two or more dice to simulate the evolution of process capability through Six Sigma breakthrough projects.”

Figure 6 The frequency distribution of outcomes when throwing two dice.

“Each die will represent a cause of defects. The sum of the dice will represent the number of defects per helicopter. Let’s say these defects cost an average of $100,000 each to repair. That means our average profit margin is $2 million. If a helicopter has 20 defects, we break even. If it has more than 20 defects, we lose money. Therefore, our Upper Specification Limit is 20 defects.”

“20 sounds like an awful lot of defects,” said Dick. “Can’t we make it a lower number?”

“Work with me, Dick. It’s just a simulation. OK, let’s get started. Our initial process involves four dice.”

For each simulation, one of them threw the four dice, one of them called out the result, and one of them entered the result into their data matrix statistical program. They traded jobs once in a while. After completing 1000 simulations, they decided they had had enough. Avona showed them how to do a process capability analysis with two mouse clicks (Figure 7).

Figure 7 Process capability analysis of the initial “four dice” process.

“We get only Cpu in this analysis because there is only an upper specification limit,” said Avona. “In situations like this, Cpu and Cpk are the same thing. Using the bell-shaped curve with ‘Mean’ marked on it, my calibrated eyeball tells me the average number of defects per unit is about 14. The standard deviation is about 3.4. Cpk is 0.56. As you can see, 4.7% of helicopters will be ‘Above USL’. In other words, 4.7% of them will have more than 20 defects, and we will lose money.”

“There’s no guarantee we can catch all the defects before a helicopter goes to a customer,” said Mary. “We might also lose market share.”

“You got it, girl,” said Avona. “It would be better to keep them from happening in the first place. Let’s assume now that we have data-mined our process data base, prioritized the causes of defects, using vector analysis of course, and successfully eliminated one of the top causes. Our improved process involves only three dice.”

After completing another 1000 simulations, Avona showed them the capability analysis of the improved process (Figure 8).

Figure 8 Process capability analysis of the new, improved “three dice” process.

“Good job. Average defects per unit, using the calibrated eyeball method, have gone from 14 to about 10. See the mean marked on the small distribution curve and the 0-25 scale just below it in Figure 8. This is an average savings of $400,000 per helicopter. The standard deviation is about 3. Cpk is 1.05. The number of defects on 831 parts per million (PPM) will be ‘Above USL’. In other words, we’ll make money on all but one helicopter in a thousand.”
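Readers can rerun the team’s experiment without dice. A minimal simulation sketch, ours rather than Avona’s data matrix program; results will wander a little from the calibrated-eyeball numbers because every run uses fresh random rolls:

    import random

    def simulate_process(n_dice, n_units=1000, usl=20):
        defects = [sum(random.randint(1, 6) for _ in range(n_dice))
                   for _ in range(n_units)]
        mean = sum(defects) / n_units
        sd = (sum((d - mean) ** 2 for d in defects) / (n_units - 1)) ** 0.5
        cpu = (usl - mean) / (3 * sd)   # only an upper spec limit, so Cpk = Cpu
        return round(mean, 1), round(sd, 1), round(cpu, 2)

    for n_dice in (4, 3, 2):   # initial, improved, and ultra-capable processes
        print(n_dice, "dice:", simulate_process(n_dice))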

“This does look a lot better than the old process,” said Tom. “But what if the average drifts up over time?”

“That’s a very good question,” said Avona. “That’s exactly why we can’t afford to be satisfied with a Cpk that is barely over 1. We need to eliminate other causes of defects so that upward drifts don’t result in yield loss. Let’s say we have done that, and our process now involves only two dice.”

After they completed another 1000 simulations, Avona showed them the capability analysis of the new process (Figure 9).

Figure 9 Analysis of the ultra-capable “two dice” process.

“Average defects per unit, using the calibrated eyeball method, has gone from 10 to about 7. This is an additional savings of $300,000 per helicopter. The standard deviation is about 2. Cpk is 1.75. The number of defects on 78 parts per billion (0.078 PPM) will be ‘Above USL’. In other words, we’ll make money on all but one helicopter in 10 million.”

“That’s not all,” noted Avona. “With an upper three-sigma limit of about 14 as a control limit, we can detect upward drifts before they cause yield loss.”

“Good point,” added Tom.

She did another simulation, a small one this time. She produced a control chart showing what would happen if the causes of defects in the “three dice” process came back (Figure 10).

Figure 10 The first 40 units come from the “two dice” process; the last 10 come from the “three dice” process.

“A control chart uses the average and the three-sigma limits to monitor a process over time,” explained Avona. “In my simulation, the first 40 units came from our ‘two dice’ process. The last 10 units came from the ‘three dice’ process. Can you see what happened on the chart?”

“Yes,” said Dick. “The dots got a lot bigger.”

“Thank you, Dick. Actually, I did that myself to distinguish the three-dice units from the two-dice ones. What about the data in relation to the average and upper three-sigma limit?”

“Well,” said Tom, “the two-dice units were evenly distributed around the average, and none of them were above the upper three sigma limit. The three-dice units are all above the average, and one of them is above the upper three-sigma limit.”

“That’s right,” said Avona. “So what you said before was exactly right. Those are the two most important rules for interpreting a control chart. These are signals that something has changed. Also, these rules give us operational definitions of when to initiate troubleshooting. Otherwise, everyone would have a different interpretation of the same data.”

“Hmm, this reminds me of the ‘standards of evidence’ you’re always talking about,” said Mary. “Are control charts related in some way to that?”

“Bingo!” exclaimed Avona. “Allow me to explain.”

“OK,” said Mary, “but can we take a break first?”

Monitoring Practices and Profits

After the break, Avona showed the team a table of financial results (Table 1). “In many companies, the Executive Committee agonizes over numbers like these every quarter. They make bar charts (Figure 11). They try to figure out what went wrong in Quarter 5, and who to blame. They try to take credit for Quarter 13, and come up with imaginative explanations.”

Table 1 Quarterly financial report (thousands of dollars).

Figure 11 Quarterly financial results (thousands of dollars).

“Looking only at totals is a big problem with traditional cost accounting variance analysis,” said Avona. “With vector analysis, we use all the information in whatever data we have. For example, if we look only at quarterly totals, we lose all the information in the monthly numbers. If we had weekly data, we would start with that. Looking only at monthly or quarterly totals would lose all the information in the week-to-week variations.”

“Don’t we do the same thing?” asked Mary. “I thought the accountants had their own special ways of doing things.”

“We used to,” answered Avona, “but not any more. Not since Rotcev took over. He immediately insisted that we apply standards of evidence everywhere, not just in manufacturing.”

“That was the problem,” commented Dick. “They could make the numbers say pretty much whatever our previous CEO wanted them to say. Rotcev is almost as enthusiastic about this stuff as you are, Avona. But I didn’t realize you could use it on financial data.”

“To do this right,” continued Avona, “we need to put the numbers into a data matrix (Table 2). Then we can apply a vector analysis. The analysis in Table 2 is exactly the same as if we were comparing 15 ways of doing something. The profit signal vector contains all the information about differences among the 15 quarters. The noise vector contains all the information in the month-to-month variations.”

Table 2 Data matrix and vector analysis for quarterly review of monthly financial data.

“Here is the cornerstone of evidence for this vector analysis (Figure 12). It gives the lengths of all the vectors.

Figure 12 The cornerstone of evidence for the vector analysis in Table 2. The numbers in parentheses are the lengths of the vectors: RAW DATA (3356), AVERAGE (3350), VARIATION (185), PROFIT SIGNAL (90) and NOISE (162). The three long vectors are not drawn to scale.

“Who can tell me if there are any significant differences among the 15 quarters?”

“There aren’t any,” Mary quickly replied. “The apparent quarter-to-quarter changes are just noise, not signals. We reached our conclusion because the F ratio wasn’t large enough to achieve a standard of evidence. In other words, the profit signal variation wasn’t large enough compared to the noise variation.”

“The p-value is 0.792,” said Avona. “It doesn’t meet any standard of evidence. Now, the F ratio basically compares the degree of variability in the profit signal vector to the degree of variability in the noise vector. Here the overall degrees of variability are about the same. We can confirm this visually by plotting the profit signal and noise vectors together on a single graph (Figure 13). The solid line is the profit signal vector and the dotted line is the noise vector. They are plotted in time sequence.”
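Readers can reproduce the F ratio from nothing more than the vector lengths printed on Figure 12. A sketch of the arithmetic; 15 quarters of 3 monthly numbers gives 45 observations:

    profit_signal_length = 90.0    # from Figure 12
    noise_length = 162.0           # from Figure 12
    df_signal = 15 - 1             # 14 degrees of freedom among quarters
    df_noise = 45 - 15             # 30 degrees of freedom within quarters

    F = (profit_signal_length ** 2 / df_signal) / (noise_length ** 2 / df_noise)
    print(round(F, 2))   # about 0.66; the matching p-value for F(14, 30) is about 0.79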


Figure 13 The numbers in the profit
signal vector are plotted as the solid
line. Each of these is an average of
3 numbers in the variation vector.
To make the visual comparison
statistically valid, the numbers in the
noise vector were first divided by the
square root of 3. Don’t blame us, it’s
a Law of the Universe. The adjusted
noise numbers are plotted as the
dotted line.

“Creating a graphical comparison like this is a little trickier
than it looks. To make the comparison statistically valid, I had
to divide the numbers in the noise vector by the square root
of 3. This is because each number in the profit signal vector is
an average of 3 numbers in the variation vector. I know this is
confusing, but it’s a Law of the Universe.

“Anyway, in 1924 a man named Walter Shewhart was trying
to come up with a good graphical method for analyzing data
over time when there is a natural or logical way of grouping
the data. For example, we grouped our raw monthly data by
calendar quarters. That made sense because the Executive
Committee reviews it on a quarterly basis. Shewhart called
these rational sub-groups.5

“Instead of plotting the profit signal vector and adjusted
noise vector on top of each other, Shewhart decided it would
be better to plot just the profit signal numbers, and use
horizontal lines to represent the upper and lower three-sigma
limits of the adjusted noise numbers. Also, he decided to add
the data average vector to the profit signal and noise vectors.
He felt this would be easier to interpret.

“In other words, he invented what we now call the X-bar
control chart (Figure 14).

“This control chart tells us the same thing as the F ratio: the
quarter-to-quarter changes are just noise.”

“I have a question,” said Dick. “Do we have to do the vector
analysis all over again every quarter?”



Figure 14 The dots are the averages
of the monthly revenues in each
quarter, not the totals. The centerline
is the grand average of all the
monthly numbers. The Upper Control
Limit (UCL) is 3 noise standard
deviations above the average.
The Lower Control Limit (LCL) is 3
noise standard deviations below the
average.
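The limits in Figure 14 are compact enough to sketch. The fragment below is our illustration, not Shewhart’s original procedure or any particular software package, and the subgroup structure is a hypothetical stand-in for Table 2’s quarters. Note the division by the square root of the subgroup size, the same adjustment described for Figure 13; SPC software also applies a small-sample bias correction (the c4 constant) to the noise estimate, which we omit for clarity:

    from statistics import mean, stdev

    def xbar_chart_limits(subgroups):
        # subgroups: e.g. 15 quarters, each a list of 3 monthly numbers
        n = len(subgroups[0])
        centerline = mean(x for group in subgroups for x in group)
        noise_sd = mean(stdev(group) for group in subgroups)  # rough noise estimate
        sigma_xbar = noise_sd / n ** 0.5   # divide by the square root of the subgroup size
        return centerline - 3 * sigma_xbar, centerline, centerline + 3 * sigma_xbar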

“That’s a good question,” answered Avona. “Fortunately, the
answer is no. Once we have a good baseline, like we have in
this example, we hold the control limits constant and just plot
the new numbers as time goes by.”

“I guess the fundamental things you’ve taught us really do
apply,” said Dick.

“OK, here’s your quiz. What are two ‘events’ on this chart that
would indicate a real change of some kind?”

“A point outside the control limits,” said Tom.

“A bunch of points in a row above the center line,” said Mary.

“Right on both counts,” said Avona. “Remember, Mary, it
could also be a bunch of points below the center line. And, by
the way, the usual requirement is eight in a row for a statistical
signal.”6
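Both rules are easy to express as a check over the plotted averages. A minimal sketch, with hypothetical argument names:

    def control_chart_signal(points, lcl, centerline, ucl, run_length=8):
        # Rule 1: any point outside the three-sigma control limits.
        if any(p < lcl or p > ucl for p in points):
            return True
        # Rule 2: eight points in a row on the same side of the centerline.
        streak, side = 0, 0
        for p in points:
            s = 1 if p > centerline else (-1 if p < centerline else 0)
            streak = streak + 1 if (s == side and s != 0) else (1 if s != 0 else 0)
            side = s
            if streak >= run_length:
                return True
        return False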

“This is great stuff,” said Mary. “But I’ve been wondering:
aren’t we still losing some of the information in the month-to-
month changes?”

“Excellent point,” said Avona. “Shewhart was aware of this
problem. His solution was to plot the standard deviations of
the subgroups on their own control chart (Figure 15). The
two charts together give us a complete picture of what’s going
on over time.”



Figure 15 The dots are the standard
deviations of the monthly revenues
in each quarter. The centerline is the
average of the standard deviations.
The Upper and Lower Control Limits
(UCL and LCL) are three-sigma limits
based on the standard deviation of
the standard deviations. Strange, but
true.
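A companion sketch for Figure 15, following the caption’s description. Textbook S charts derive these limits from tabulated B3 and B4 constants; the simpler version here literally uses the standard deviation of the standard deviations, as the caption says:

    from statistics import mean, stdev

    def s_chart_limits(subgroups):
        s_values = [stdev(group) for group in subgroups]   # the dots on the chart
        centerline = mean(s_values)
        spread = stdev(s_values)   # the standard deviation of the standard deviations
        lcl = max(0.0, centerline - 3 * spread)   # a standard deviation cannot be negative
        return lcl, centerline, centerline + 3 * spread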

Taking Action

After the others left, Avona realized there was another basic
fact about control charts that she needed to teach them. It
wasn’t about how to set up the charts, or how to interpret
them. She felt that was pretty easy.

She knew from experience that control charts were all too
often used as “window dressing”. Maybe “wallpaper” is a
better analogy. In manufacturing at least, she knew that
control charts add real value only when they are used as a
basis for action.

She also knew that reacting to control chart signals was a
process, just like any other business activity. In order to add
value, the reaction process must be defined and documented.
It must be improved over time.

She had found the tools of Process Mapping to be ideal for
these tasks. In her experience, it worked best to have teams
of operators, supervisors, maintenance technicians, engineers
and managers develop the reaction plans together. She had a
reaction plan “skeleton” she always used to get them started
(Figure 16).

The question “Signal?” refers to one or more pre-defined
signals on one or more control charts. The charts and signals
are defined by the team that develops the plan. The term


Figure 16 A generic reaction plan
“skeleton” for a manufacturing or
service process.

“escalate” means to raise the level of the investigation by
bringing in someone with greater expertise. Ideally, the
manufacturing or service process is stopped until “Continue”
is reached. Figure 17 shows an actual example of a reaction
plan for a lot-based manufacturing process.

Figure 17 An example of a reaction
plan for a manufacturing process.

In this example, the team decided to confirm a control chart
signal by immediately taking a second sample from the
same lot. If the second sample does not show a signal, the


occurrence is documented and the lot moves on to the next
operation.

If the second sample does show a control chart signal, the
manufacturing process is put on hold while the Operator
goes through a pre-determined checklist. The checklists in a
reaction plan are determined by the team that develops the
plan. That is why it is so important that all vocations are
represented on the team: operator, supervisor, maintenance
technician, engineer, and manager.

If the operator solves the problem, the occurrence is
documented and the lot moves on to the next operation.
Otherwise, the supervisor is called in. It may be necessary to
bring in the engineer, or the maintenance technician, or even
the manager. The important point is that the manufacturing
process remains on hold until one of two things happens:

1. The problem is solved.

2. Someone of sufficiently high authority makes the
decision to resume manufacturing while the problem is
being worked on.

The keys to the success of reaction plans are:

(a) Orderly and consistent evidence-based response to
problems as they occur.

(b) Visibility of problems throughout the
organization, appropriate to their level of severity.

(c) Evidence-based decisions made at the appropriate
levels of responsibility throughout the organization.
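A reaction plan behaves like a small program, which is one reason teams can document it so precisely. The sketch below is our paraphrase of the flow in Figure 17, not code from any real system; the role list and checklist functions are hypothetical placeholders:

    ESCALATION_ORDER = ["operator", "supervisor", "maintenance technician",
                        "engineer", "manager"]

    def react_to_signal(lot, second_sample_shows_signal, checklists, log):
        if not second_sample_shows_signal(lot):
            log(lot, "signal not confirmed; lot moves on")
            return
        # The manufacturing process is on hold from here on.
        for role in ESCALATION_ORDER:
            if checklists[role](lot):   # this role's checklist solved the problem
                log(lot, "problem solved by " + role)
                return
        # No checklist solved it; someone with sufficient authority decides.
        log(lot, "resume authorized while the problem is worked on")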

A disciplined approach like this is a bitter pill at first.
Supervisors and managers object to the loss of production
time. After a few weeks or months, the same supervisors
and managers are singing the praises of their reaction plans.
Invariably, they have seen their unplanned downtime
plummet. Problems are being fixed right away, instead of
being ignored until they become catastrophes.

These short-term economic benefits are overshadowed by
long-term improvements in process capability. The old

“four dice” process gives way to the “three dice” process. We think, “That’s great, but that’s it. It’s impossible to get any better.” But we keep following our reaction plan, refining it as we learn new things. One day we wake up and, to our great astonishment, the impossible has happened. We find ourselves with an ultra-capable “two dice” process. We find our competitors using us as the benchmark.

You are asking yourself, is this really possible? Was Pythagoras right about right triangles? Is the earth spherical, and does it revolve around the sun? Does multiplication work? Do gravity and electricity exist? Do airplanes fly? Can you buy things on a computer and have them delivered to your door? Are vectors and hyperspace real? All the evidence we have says “yes”.

Closing Arguments

“The idea of control involves action for the purpose of achieving a desired result.” Walter A. Shewhart, Statistical Method from the Viewpoint of Quality Control, 1939.7

Endnotes

1 Scott, William G. and Hart, David K. Organizational Values in America. New Brunswick: Transaction Publishers, 1991. Page 139.

2 Committee on Quality of Healthcare in America; Kohn, Linda T., Corrigan, Janet M., and Donaldson, Molla S., Editors. To Err is Human: Building a Safer Health System. Washington, D.C.: National Academy Press, 2001.

3 http://www.ralentz.com/old/space/feynman-report.html

4 http://www.student.math.uwaterloo.ca/~stat231/stat231_01_02/w02/section3/fi4.pdf and http://www.uri.edu/artsci/ecn/mead/306a/Tuftegifs/Tufte3.html

5 Shewhart, Walter A. Economic Control of Quality of Manufactured Product. New York: D. Van Nostrand Company, Inc., 1931. Republished in 1980 by American Society for Quality Control.

6 AT&T Technologies, Inc. Statistical Quality Control Handbook. Copyright 1956 by Western Electric, Inc.; copyright renewed in 1984 by AT&T Technologies, Inc.

7 Shewhart, Walter A. Statistical Method from the Viewpoint of Quality Control. New York: Dover Publications, Inc., 1986.


Chapter 8
The Three Rs

Education and training are the first steps in building an organization founded on evidence-based decisions and the New Management Equation. Knowledge and skill necessarily change the nature of authority.1 Trust, decency and respect replace fear and favor as social adhesives.2 American history provides an excellent road map for redefining the 3 Rs—Reading, wRiting, and aRithmetic—to Reading, wRiting, and vectoR analysis.

John Adams wrestled with evidence as we all do. “Facts are stubborn things, and whatever may be our wishes, our inclinations, or the dictates of our passions, they cannot alter the state of facts and evidence.” “Liberty cannot be preserved without a general knowledge among the people.”3 Adams and his colleagues were as passionate about intellectual liberty as they were about freedom.

“We thought that on this subject, a systematical plan of general education should be proposed, and I was requested to undertake it. I accordingly prepared three bills for the Revisal, proposing three distinct grades of education,” Thomas Jefferson wrote not only to Adams, but to all of us in his 1821 autobiography.4

Jefferson’s friend and ghostwriter got, characteristically, to the point. “Education should be on the spot... and the best method… I call for the education of one million and thirty thousand children.” This was a bold proposal. Those of us who enjoy the rare combined privilege of United States citizenship and an American public education can thank Thomas Paine.5

Paine’s outlandishly impractical investment scheme turned out to be the bargain of the millennium. We predict that in the new millennium, Paine’s good deal can yield even better bottom line business results.

Six Sigma’s Hidden Factory

“I have been thinking about what you have taught us, Avona,” said Dick. “Obviously I had a hard time understanding those spreadsheet tables. Numbers make me nervous. Then I get embarrassed. Then I say dumb things I wish I could take back. Thanks for not making fun of me when my lights were out. Once you showed me the Pareto chart that rank-ordered the factors, my light finally went on. Anyway, I drew a flow diagram yesterday. That darned picture kept me awake all night long. I was driving my wife crazy tossing around. So I got up at 3 AM and came into work.”

Rotcev the CEO, Avona, Tom and Mary listened. Avona, Tom, and Mary looked at Dick’s map (Figure 1). They were smiling.

“You are one brave guy,” complimented Rotcev. “I can see exactly what you mean by my comfort zone. The last time an employee told me I was full of baloney was when I was a Vice President of Marketing. I began getting uncomfortable in 1986. Before that, I had it all. Big office. Big desk. Four phone lines. Two secretaries and a million-dollar advertising budget. Heck, if I couldn’t make myself look better than everyone else at the annual review, I would have had a real problem.”

Everyone stared at Rotcev.

Figure 1 The hidden factory of traditional Six Sigma. Projects are delayed and deferred because of cost accounting, finance or data analysis, not employees. Proposals can make people so uncomfortable that they would rather waste money than upset the status quo.

“One day I was working a night shift to ‘get close to my people.’ I sat down next to a clerk. I believe she was 19. I do remember she was a sophomore at USC. Bold, because USC stood for the University of Southern Colorado, not the University of Southern California. Since the lobby was vacant at one AM, I tried to keep up the conversation by expressing an interest. She was none too pleased to have me pay her a social visit. So, after about 30 minutes of chit chat, she told me she had homework to do. She shut her mathematics text and looked right at me.

“‘You know, you guys in senior management don’t have a clue. Do you?’

“‘Ummm. What do you mean exactly?’

“‘Well, I read that bull poop memo about your recent management decision. You know the one with all the numbers?’

“‘Yes, I do.’

“‘Well, anybody who has taken Statistics 101 at USC can tell you don’t even know how to do an Analysis of Variance.’

“I had taken Statistics 101 as a foreign exchange student at Baldwin-Wallace College in 1969, so I said, ‘Well, actually I did take that class, but I got a C in it and that was a gift. I never did learn what Analysis of Variance meant. Please show me what you mean.’

“She pulled out a piece of graph paper, a calculator, a ruler, and a pencil. She drew me a picture. Her whole show took about five minutes. Our layoff plan to save money and our $11 million senior management data analysis on the supposed need for a massive building program just got an F. I thanked her and excused myself.

“Whew. I did not sleep well. I figured if a 19-year-old could see through the faulty reasoning behind the decisions our CEO, COO and me were making, maybe the other 1,000 employees could too. I did decide to confront my math phobia. One thing led to another. I learned how to draw control charts. My colleagues chose to keep on bamboozling. Many of them still are. They all got promoted. Now I am a CEO.”

Rotcev looked at Dick. “So, just for the sake of the argument you thought we would have, let’s say your map is true. You say we have this gigantic, Six Sigma hidden factory of rework. As CEO, I am the reason for Six Sigma project rework. My comfort zone is the problem that stops projects. Instead of a 19-year-old kid with spunk, I have a top-flight team. What should I do? What would you do?”

“Wow,” said Dick. “I thought you’d go seriously supersonic if you thought I was saying you blame accounting and finance instead of stepping up to the plate and just saying, ‘This change scares the heck out of me.’”

“I would start phasing out the Six Sigma bureaucracy,” replied Dick.

“What!” cried Tom and Mary, who had just mounted and framed their Black Belt certificates.

“Let me get this straight, Dick. You just earned your American Society for Quality Six Sigma Black Belt certification. Do you mean to tell me that you are proposing to give it all up for the good of the company?” asked Rotcev.

“Are you saying that Black Belts aren’t needed?” said Tom with a stern look on his face.

“No. We need experts and good teachers. You are an expert, Avona. You are a teacher. But people respect you and Mary because of what you know and do, not because of your numbered certificates. All I am saying is, everybody has a brain. All the people we work with have imaginations. Everybody can contribute. I think we need everybody. All this Black Belt and Green Belt stuff is overhead. You, Avona, and your models, simulations, and software have made Six Sigma so simple. I even used a cube to outline the three factor interaction. Six Sigma is simply the use of evidence-based decisions. That idea is as old as Aristotle. It would be less costly, and more effective, if we called our Six Sigma program the Three Rs: Reading, wRiting, and vectoR analysis (see Figure 2). We could just call Six Sigma ‘literacy’. Let’s hire people who are literate, or who want to be literate. Let’s call the way we work literacy and be done with it.”

“Huh? Literacy?” said Mary.

“Yes. Whatever. We could have some fun with it. Maybe it could be Reading, wRiting, and Refraction,” suggested Dick.

Rotcev nodded his head. Avona chuckled quietly and looked at her shoelaces. “Let’s go get a latte, chai and biscotti. I’m buying.”

Figure 2 Literacy now refers to people who know how to read, write and vector-analyze numbers. [The drawing shows the Three Rs: Reading, wRiting, and vectoR analysis.]

Our Proposal

A global workforce that is literate in the Three Rs of the New Management Equation is an excellent value. But there is a cost. It is far less costly than the alternatives.

“New arts destroy the old. See the investment of capital in aqueducts, made useless by hydraulics; fortifications, by gunpowder; roads and canals, by railways; sails, by steam; steam by electricity,” wrote Ralph Waldo Emerson.6 His observations ring true as we watch vacuum tubes made almost useless by transistors; transistors by silicon chips; typewriters by computer keyboards; telegrams by wireless communication; palpation by Magnetic Resonance Imaging; auscultation by ultrasound; poisonous purple foxglove seed remedies by quality controlled digitalis; spreadsheets by the data matrix software; the cost accounting variance analysis by the Analysis of Variance.

In 1992, while grappling with the hand calculations, pencil, ruler, and Xeroxed chart template required to produce a statistical process control chart, we heard the Chief Executive Officer of Northwest Hospital in Seattle at that time criticize cost accounting by quoting Emerson.7 Sometimes you just get lucky. His daring Total Quality Management (TQM) observation is intrepid today:

“A foolish consistency is the hobgoblin of little minds, adored by little statesmen and philosophers and divines. With consistency a great soul has nothing to do. He may as well concern himself with his shadow on the wall. Speak what you think in hard words and tomorrow speak what to-morrow thinks in hard words again, though it contradict every thing you said to-day—‘Ah, so you are sure to be misunderstood.’—Is it so bad then to be misunderstood? Pythagoras was misunderstood, and Socrates, and Jesus, and Luther, and Copernicus, and Galileo, and Newton, and every pure and wise spirit who ever took flesh. To be great is to be misunderstood.”8

Though none of us were able to articulate a proposal for improving the cost accounting variance analysis back then—that being a vector analysis applied to a data matrix—we came to discover that the rest of Emerson’s quote was prophetic. Each age, as Emerson pointed out, must write its own books. The books of an older generation will not fit ours.

Motorola’s Six Sigma business initiative was designed at a time when a dual 5.25-inch floppy disk drive IBM computer with an amber screen was an executive luxury. Harvard Graphics bar charts on a dot matrix printer were breakthrough technology. 9600 Baud was a fast connection. When General Electric got a hold of Six Sigma, the Internet and Windows 95 were new.

Time flies. Can it be our own college students were babies in the eighties? Great Caesar’s Ghost! We are old men. How can this be? We have grey hair. There are bald spots on the back of our heads. We are wearing progressive lens glasses. Our Les Paul, Stratocaster and PRS guitars and Cry Baby Wah-Wah pedals are antiques for sale on EBay. When did this happen? Do we look as funky as a judo gi and as old as a Six Sigma acronym?

Yes. We must change with the times. We must learn, unlearn and relearn how to do things. We must work, and work very hard, to stay young. We must try to age gracefully. We must negotiate and we, all of us, must get to Yes together.9 We do.

Endnotes

1 Wood, Gordon S. The Radicalism of the American Revolution. Alfred A. Knopf, New York, 1992, page 189.
2 Wood, Gordon S. The Radicalism of the American Revolution. Alfred A. Knopf, New York, 1992, page 189.
3 http://www.dropbears.com/b/broughsbooks/history/articles/john_adams_quotations.htm
4 Jefferson, Thomas. The Life and Selected Writings of Thomas Jefferson, edited by Adrienne Koch and William Peden. Random House, New York, 1944, pages 630-633.
5 Paine, Thomas. Collected Writings. Library of America, New York, 1995, page 48.
6 Emerson, Ralph Waldo. “Circles,” from The Portable Emerson, edited by Carl Bode in collaboration with Malcolm Cowley. Penguin Books, New York, 1981, page 229.
7 Sloan, M. Daniel, and Torpey, Jodi B. Success Stories in Lowering Health Care Costs by Improving Health Care Quality. ASQ Quality Press, Milwaukee, 1992, pages 87-97.
8 Emerson, Ralph Waldo. “Self-Reliance,” from The Portable Emerson, edited by Carl Bode in collaboration with Malcolm Cowley. Penguin Books, New York, 1981.
9 Fisher, Roger, and Ury, William. Getting to Yes: Negotiating Agreement Without Giving In. Penguin Books, New York.

Appendices

I. Glossary of Terms: Data Matrix, Vector Analysis and Evidence-based Decisions

ANOVA – Acronym for Analysis of Variance, Fisher’s general term for the various forms of vector analysis he developed.

Confidence level – Obtained by subtracting the p-value from the number 1, then multiplying by 100. It is a measure of the strength of evidence in the data against the null hypothesis.

Cornerstone of Evidence – A generalized tetrahedron representing a vector analysis. Each of the four faces is a generalized right triangle. The six sides or edges represent the raw data vector and the five possible vector components of variation that can be broken out of any set of raw data.

Data matrix – An array of numbers or labels in rows and columns. Each row is an object, entity or event for which we have collected data. Each column is one of the variables we have measured or observed.

Data vector – A stack of numbers or labels treated as a single entity. A column in a data matrix is a vector. It is a point in n-dimensional space, where n is the number of rows in the data matrix.

DMAIC – An acronym for Define, Measure, Analyze, Improve and Control, which is the Six Sigma project cycle.

Factor – A controlled variable in a designed experiment.

F ratio – A measure of the strength of evidence in the data against the null hypothesis. A statistic proportional to the ratio of the squared length of the profit signal vector to the squared length of the noise vector.

New Management Equation – Our name for the Pythagorean Theorem, c² = a² + b².

Noise – The chance, random, normal, common, statistical variation found everywhere in Nature.

P-value – The probability of getting, by chance alone, an F ratio as large as the one we got. A p-value less than 0.15 gives a ‘preponderance of evidence’ against the null hypothesis. A p-value less than 0.05 gives ‘clear and convincing’ evidence against the null hypothesis. A p-value less than 0.01 gives evidence ‘beyond a reasonable doubt’ against the null hypothesis.

Profit Signal™ – Quantifies and rank orders which factors impact any business, manufacturing, or service process.

Profit signal vector – Same as profit signal. It is the vector at the bottom right-hand, forward corner of the tetrahedron. The ratio of the length of this vector to the length of the noise vector in a correct analysis yields the F ratio that measures the strength of evidence.

Pythagorean Theorem – The square of the long side of a right triangle is equal to the sum of the squares of the other two sides. It is a Law of the Universe.

Tetrahedron – A three-dimensional figure with four triangular faces and six edges.

Vector – An arrow that defines magnitude and direction, connecting one point in space with another.

Vector analysis – The process of breaking up a raw data vector into perpendicular vector components of variation.
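To make the glossary concrete, here is a small worked example. The sketch is ours rather than part of the original text: it assumes Python with the numpy and scipy libraries (tools outside the software recommendations in Appendix III), and the two machines and their measurements are invented for illustration. It breaks a raw data vector into an average vector, a profit signal vector and a noise vector, checks the Pythagorean identity, and turns the F ratio into a p-value and a confidence level exactly as defined above.

    # A minimal sketch, assuming Python with numpy and scipy available.
    # Machines "A" and "B" and all measurements are hypothetical.
    import numpy as np
    from scipy import stats

    machine = np.array(["A", "A", "A", "A", "B", "B", "B", "B"])
    y = np.array([71.0, 73.0, 74.0, 70.0, 76.0, 78.0, 75.0, 77.0])

    grand = y.mean()                                   # the average (grand mean)
    means = {g: y[machine == g].mean() for g in ("A", "B")}
    signal = np.array([means[g] for g in machine]) - grand   # profit signal vector
    noise = y - grand - signal                               # noise vector

    # Pythagorean check: |y - grand|^2 = |signal|^2 + |noise|^2
    print(((y - grand) ** 2).sum(), (signal ** 2).sum() + (noise ** 2).sum())

    df_signal, df_noise = 1, len(y) - 2
    F = ((signal ** 2).sum() / df_signal) / ((noise ** 2).sum() / df_noise)
    p = stats.f.sf(F, df_signal, df_noise)    # p-value for the F ratio
    print(F, p, (1 - p) * 100)                # confidence level = (1 - p) x 100

Run as written, the script reports F = 16.2 with a p-value near 0.007: evidence ‘beyond a reasonable doubt,’ in the glossary’s terms, that the two hypothetical machines differ.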

II. The Business Bookshelf

1. Aristotle’s books on Logic, Eudemian Ethics, and Politics are essential. Posterior Analytics suggests that the triangle signifies truth. These texts outline the sequential, inductive/deductive cycle of the scientific method: Hypothesis, Experiment, Test Hypothesis. Aristotle’s cycle is the foundation for all science, for Walter Shewhart’s original Plan, Do, Study, Act cycle, and for M. Daniel Sloan’s IDEA Cycle: Induction, Deduction, Evaluation, Action. Eudemian Ethics details the links between a respect for the individual, virtue, justice, knowledge, and a good social order.

2. David Hume’s A Treatise of Human Nature, 1739, emphasizes the importance of sequential perceptions and the circular logic of knowing. Hume’s ideas, writing, and thinking reflect the British personality in applied science. Bacon, Newton, Darwin, Fisher, Nightingale, and Box are names that can be culturally linked to Hume’s work. Einstein specifically honored this work as an inspirational force in his own.

3. The Declaration of Independence, 1776. This classic document addresses life, liberty, the pursuit of happiness, and a good social order. Washington, Adams, Jefferson, Franklin, and other American revolutionaries were Hume’s contemporaries. They communicated. The study of philosophy, science, writing, and mathematics were integral to their lives.

4. Immanuel Kant’s Critique of Pure Reason, 1781, tackles the complexity of quality. The quality of judgments, the sequential order of perceptions, relationships, and modality are dealt with in one difficult, challenging text.

5. Albert Einstein’s little book, Relativity, 1917. Dr. Einstein introduces the idea that “the evolution of an empirical science is a continuous process of induction.” Einstein’s ideas on time, measurement, and analysis are landmarks. He specifically describes his use of probability, the Pythagorean Theorem, and the Cartesian coordinate system. He provides a complete listing of applied

science thought leaders: Euclid, Descartes, Galileo, Gauss, Kepler, Hume, and Kant.

6. Ronald A. Fisher’s works from 1913-1935. (A good resource for finding them is Collected Papers, Volumes 1-5, J.H. Bennet, Ed., The University of Adelaide, 1971-1974.) Fisher’s 1915 Biometrika paper, “Frequency Distribution of the Values of the Correlation Coefficient in Samples from an Indefinitely Large Population,” written at age 25, introduces the idea of using geometry to represent statistical samples. The Pythagorean Theorem or New Management Equation is a Generalization. It applies to samples of any size. Fisher’s 1921 Metron paper, “On the Probable Error of a Coefficient of Correlation Deduced from a Small Sample,” explains the logarithmic transformation of the correlation coefficient r that leads to a near normal distribution; he presented a table tabulating the transformation for each value of r (the formula appears at the end of this list). Statistical Methods for Research Workers, 1924: the thirteenth edition credits W. Edwards Deming for the extension of the z Table to the 0.1 level of accuracy. The Design of Experiments, 1935: “Inductive inference is the only process known to us by which essentially new knowledge comes into the world.” The importance of experimental observations must be connected to the “precise, deductive reasoning” of Euclidean geometry. Induction precedes deduction.

7. Clarence Irving Lewis, Mind and the World Order: Outline of a Theory of Knowledge, 1929, is a phenomenal, seminal work. The philosophy of conceptual pragmatism led to the development of the field of Six Sigma quality improvement. This book inspired Walter Shewhart.

8. Walter A. Shewhart’s Economic Control of Quality of Manufactured Product, 1931. This book details the practical application of a circular, inductive and deductive logic cycle. It includes illustrations and ideas from Fisher’s work and Shewhart’s own unique perspective on the importance of sequential data analysis.

Shewhart’s Statistical Method from the Viewpoint of Quality Control, 1939, a noteworthy historical book, is taken from a series of US Agricultural Department lectures delivered at the invitation of W. Edwards Deming. This is a master work of applied science. Pages 44 and 45 contain the graphic illustration of a continuous improvement cycle: Hypothesis (Legislative in nature), Experiment (Executive in character), Test Hypothesis (Judicial). He describes the character of the continuous improvement cycle as legislative, executive, and judicial. His PDCA improvement cycle dates to Aristotle. Induction precedes deduction. Shewhart’s paper “The Nature and Origin of Standards of Quality” was written in 1935 and was published in the January 1958 issue of The Bell System Technical Journal.

9. W. Edwards Deming’s Some Theory of Sampling, 1950. It was directly affected by Fisher and Shewhart’s work. The pictures R.A. Fisher imagined are drawn, and Deming details the geometry of sample variances on page 62. Deming cites the influence of Shewhart. “On a Classification of the Problems of Statistical Inference,” The American Statistical Association Journal, Volume 37, Number 218, is a companion piece. “On the Distinction between Enumerative and Analytic Surveys,” The American Statistical Association Journal, June 1953: this article shows the futility of using random samples for analyzing a dynamic process. Out of the Crisis, 1986, gives Deming’s vision of a quality controlled health care system. Here are fourteen points to ponder for a good social order in the workplace. It is curious to note that Deming’s 1951 understanding of the importance of a designed experiment and the economy/geometry of samples, which comes directly from Some Theory of Sampling, is absent from this work.

10. Ludwig von Bertalanffy’s General System Theory, 1968. This is a best-of-class book on systems thinking.

11. George Box, William G. Hunter, and J. Stuart Hunter, Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building, 1978. The first chapter addresses the primary importance of the design of an experiment. Induction precedes deduction.

Many of the important algebraic expressions Fisher wrote are translated into graphs and flow diagrams. Somehow, though, Fisher’s ideas are simplified, and his main point, inductive reasoning, is hidden from view. Induction precedes deduction. The Pythagorean Theorem provides sound theory for all standard statistical theory.

12. Darrell Huff, How to Lie With Statistics. This is the definitive 20th Century work on the Big Bamboozle.

13. Roger Fisher and William Ury, Getting to Yes: Negotiating Agreement Without Giving In. This is the handbook for teaching people how to bring Six Sigma breakthroughs to fruition.

14. Steve deShazer’s Clues: Investigating Solutions in Brief Therapy, 1988, is the only therapy model and/or psychological theory we know of that was developed using probability theory. One of the book’s essential main points: Mr. deShazer formally opposes a focus on defects, defectives, and problems. Rather, one should focus on solutions and doing more of what works. This model for rapid improvement works well in systems of any size.
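For readers who want the formula behind entry 6, Fisher’s transformation of the correlation coefficient can be stated compactly. In modern notation (our summary of a standard result, not a formula printed in the works cited above):

\[ z = \tfrac{1}{2}\,\ln\!\left(\frac{1+r}{1-r}\right), \qquad \operatorname{SE}(z) \approx \frac{1}{\sqrt{n-3}}, \]

where n is the number of paired observations. The transformed value z is very nearly normal even for small samples, which is why Fisher could tabulate the transformation once for every value of r.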

III. Master Black Belt, Six Sigma Black Belt/Expert 16 Class Curriculum Outline

Black Belt Core Study Texts and Free On-Line Resources:

1. Profit Signals: How Evidence-based Decisions Power Six Sigma Breakthroughs, M. Daniel Sloan and Russell A. Boyles PhD. Evidence-based Decisions, Sloan Consulting and Westview Analytics, Inc., 2003. Free PDF download, on-line Internet resource.
2. Getting to Yes: Negotiating Agreement Without Giving In, by Roger Fisher and William Ury. (1991) ISBN 0-14-015735-2
3. Getting Ready to Negotiate: The Getting to Yes Workbook, by Roger Fisher and Danny Ertel. (1995) ISBN 0-14-023531-0
4. How to Lie With Statistics, Darrell Huff. (1954) ISBN 0-393-31072
5. Learning to See, the Lean Value Stream Mapping work book. http://www.lean.org/Lean/Bookstore/ProductDetails.cfm?SelectedProductID=9
6. Engineering Statistics Handbook. Free on-line resource: http://www.itl.nist.gov/div898/handbook/index.htm

Bedrock Classics:

1. Economic Control of Quality of Manufactured Product, Walter A. Shewhart. (1931) ASQ Quality Press, Milwaukee, Wisconsin. Must read.
2. Statistics for Experimenters, Box, Hunter and Hunter. (1978) ISBN 0-471-09315-7. Must read.

Optional Show Stopper: Paper Flight, by Jack Botermans. Complete, easy to follow instructions for making 48 different models that fly. (1984) ISBN 0-8050-0500-5

Software Recommendations: Superior software is essential to breakthrough improvements and bottom line business results.

• JMP 5.0, http://www.jmpdiscovery.com/ — As of September 2003, we believe this vector analysis program is the best in class. It is capable of handling virtually all of the analysis work required in Six Sigma breakthrough projects.
• Minitab, http://www.minitab.com/ — Ease-of-use and a short learning curve make this program desirable for some executive champions. We happily accommodate customers who prefer this excellent application.
• Microsoft Excel — Six Sigma leaders must know how to use Excel and its add-ins. Another application, Quality America’s Excel SPC-IV add-in, http://qualityamerica.com/, is also available.
• Crystal Ball by Decisioneering, http://decisioneering.com — This multi-variate, financial simulation tool is superb for enlisting and retaining finance leader support. This tool can be an excellent guide for project selection.
• Adobe Acrobat — Our course is published for students using Adobe Acrobat 5.0. Portable Document Format (PDF), the de facto standard, is a requirement for printing, reading, note taking and electronic file attachments.

Our course is distinguished by the speed with which Black Belt candidates produce bottom line business results. The rigor and relevance of the course content are structured around the proven Six Sigma DMAIC cycle: Define, Measure, Analyze, Improve and Control. Course content covers the American Society for Quality’s (ASQ) Six Sigma Body of Knowledge and uses Bloom’s taxonomy of knowledge:

Knowledge: Black Belt Experts must be able to recognize terminology, definitions, ideas, principles and methods.

Comprehension: Experts must be able to understand tables, diagrams, reports, and directions.

Application: Experts must be able to apply principles, methods, and concepts on the job.

Analysis: Experts must be able to break down data and information. Statistical reasoning, inductive and deductive reasoning, analysis, and computing literacy are key.

Synthesis: Experts must expose unseen and informative patterns.

Evaluation: Black Belt Experts must be able to make judgments regarding the value of proposed ideas and solutions.

Black Belt Course Outline

1.0. Defining Six Sigma: Introduction, Overview, and History – A Six Sigma Gestalt
1.1. Introductions. Learning Objectives: Theory and practice.
1.2. The 5-Minute PhD: Vector analysis applied to a data matrix.
1.3. JMP 5.0 (or Minitab 13) software navigation are introduced.
1.4. The Complete Six Sigma Tool Kit: Categorical Catapult Experiment: 2³ Designed Experiment (DOE), the ANOVA, Regression, Correlation, Scatter Diagrams, Histograms, Pareto charts, Control Charts (a design matrix sketch follows this class outline).
1.5. Four Essentials in a thorough 6 Sigma Analysis:
1.5.1. Calculate the Mean. Recognize that the mode and median exist.
1.5.2. Calculate the Standard Deviation: s and sigma, σ.
1.5.3. Calculate Improbability – The F ratio.

1.5.4. Graph data in meaningful ways that illustrate the mean, standard deviation and probability information.
1.6. Standards of Evidence: Evidence-based Profitability Principles.
1.6.1. Y = f(X1, X2, …, Xn)
1.6.2. Analogy (1931-2003): Legal System Decisions
1.6.3. Analogy: Management System Decisions
1.6.4. Vector Analysis applied to a Data Matrix
1.7. Six Sigma: History, philosophy, goals and models.
1.7.1. The scientific method: Hypothesis, Experiment, Test Hypothesis.
1.7.2. PDSA or PDCA: Plan, Do, Study, Act or Plan, Do, Check, Act.
1.7.3. The IDEA cycle: Induction, Deduction, Evaluation and Action.
1.7.4. DMAIC: Define, Measure, Analyze, Improve, Control.
1.8. An Enterprise View: Suppliers, Inputs, Process, Outputs and Customers
1.9. Selecting and Leveraging Projects
1.10. Lucrative Project Selection
1.10.1. The Six Sigma Lucrative Projects Results Map
1.10.2. Calculating the Priority Projects Using an Excel matrix
1.10.3. Where are you today? Where do you want to be in your future?
1.10.4. S.M.A.R.T. projects: Specific, Measurable, Achievable, Relevant, and Time Bounded.
1.10.5. Project Charters and Planning Tools: Gantt and Performance Evaluation and Review Technique (PERT) Charts
1.10.6. Brainstorming
1.11. Interactive Dialogue: Assessing evidence in your corporate culture.
1.12. Designed Experiment Homework. Every class participant will complete his or her first breakthrough project this evening. In-class demonstrations are mandatory. Results will be recorded and analyzed using JMP or Minitab for class presentations during class 2.
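Class 1’s Categorical Catapult exercise turns on the eight-run matrix of a 2³ designed experiment. The sketch below is our illustration, not part of the course materials: it assumes Python rather than the JMP or Minitab tools named above, and the factor names are hypothetical stand-ins for whatever the catapult team actually controls.

    # A minimal sketch, assuming Python; factor names are hypothetical.
    from itertools import product

    factors = ["arm_angle", "band_tension", "cup_position"]

    # Full factorial: every combination of low (-1) and high (+1) settings.
    for run, levels in enumerate(product([-1, +1], repeat=3), start=1):
        settings = ", ".join(f"{f}={x:+d}" for f, x in zip(factors, levels))
        print(f"run {run}: {settings}")

Eight runs cover every corner of the cube the dialogue mentions; randomizing the run order before launching anything is the practice that item 14.4.12 (Randomization) returns to later in the course.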

2.0. Define: Organizational Responsibilities and Financial Six Sigma.
2.1. Learning Objectives
2.1.1. Homework Experiment Presentations using software. Designed experimentation demonstrations from home. Typically these are spread out through the entire day. By the end of the day people have memorized software keystrokes for either Minitab or JMP. Both are easy to master. Both yield identical answers. They are reliable as the sunrise and sunset.
2.2. Six Sigma Language. Accuracy and Precision.
2.2.1. What is different about 6 Sigma and other problem solving tools?
2.2.2. “Kickin’ the heck out of variation” led to the martial arts metaphor.
2.2.3. Class dialogue on cultural norms and issues related to this topic.
2.3. Leadership and Job Descriptions:
2.3.1. Executive
2.3.2. Champions
2.3.3. Master Black Belts
2.3.4. Black Belts
2.3.5. Green Belts
2.4. Linking Organizational Goals and Objectives to Six Sigma
2.4.1. SWOT analysis of sub-optimizing systems:
2.4.1.1. Strengths
2.4.1.2. Weaknesses
2.4.1.3. Opportunities
2.4.1.4. Threats
2.4.2. Closed and Open Loop Feedback Systems
2.5. Profit Signals workshop to include Chief Financial Officers and/or Controllers
2.5.1. The New Management Equation - Old Equation Comparison
2.5.2. Crystal Ball budget building: Decisioneering Tutorial Review
2.5.3. Futura Apartments Tutorial
2.5.4. Vision Research Tutorial
2.5.5. Predicting the future with the Profiler.
2.5.6. Hands-on corporate example and demonstration
2.6. The Continuous Catapult Experiment: 2³ Designed Experiment (DOE)
2.7. How to Lie with Statistics reading assignments and discussion. Getting to Yes workbook reports.

2.8. Executive Team Presentations
2.8.1. Project Documentation: Data, analysis, and evidence do not speak for themselves.
2.8.2. Story Boards
2.8.3. Spreadsheets
2.8.4. Management Reviews
2.8.5. Phased Reviews

3.0. Define: Six Sigma Project Selection and Benchmarking
3.1. Learning Objectives
3.1.1. Paper Airplane Homework Presentations. Debriefing. Dialogue discussion.
3.2. A Complete Six Sigma Pilot Project – Synectic Experiment: Paper Flight, Jack Botermans. Complete, easy to follow instructions for making 48 different models that fly. (1984) ISBN 0-8050-0500-5
3.2.1. Brainstorm and draw one, build, and fly paper airplanes according to your experimental array with your team.
3.2.2. Iterative learning and fun. Intuitive and counter-intuitive solutions. Analogies.
3.3. A Catapult 2⁵ DMAIC Experiment: Design and Analysis. Practical Applications. Statistical Software Application practice. Profiler: Optimization and Desirability. Predicting the Future with categorical and continuous variables.
3.4. Six Sigma is a Business Initiative NOT a quality initiative. The American Society for Quality’s Black Belt test is discussed. As of 2003, there were no questions related to vector analysis or the data matrix. Consequently, we cover the entire list of recommended tools.
3.5. Projects and the SIPOC diagram. SIPOC Diagrams (Supplier, Inputs, Process, Outputs, and Customer)
3.5.1. The 5 Whys
3.6. Negotiation techniques for Success: Getting to Yes.
3.6.1. Build Relationships and BATNA
3.6.2. 10 Principles for Getting to Yes
3.6.3. Wise, efficient agreements
3.7. Project Timelines
3.8. Homework: Visualize and plan your breakthrough project presentation. Begin building a Crystal Ball model related to potential Six Sigma projects.

3.9. Hands-on Define, Measure, and Analyze Experiments.
3.10. Defining – Design for Six Sigma
3.10.1. Operational Definitions – Critical to Quality Characteristics
3.10.2. Voice of the Customer (VOC)
3.10.3. Brainstorming Critical To Quality Flight Standards
3.10.4. Ground Rules for Nominal Group Technique
3.10.5. Benchmarking – Process Elements and Boundaries.
3.10.5.1. Internal Best Practices using the complete Six Sigma tool kit.
3.10.5.2. Literature Searches: Internet and Company.
3.10.5.3. Plant Visits and interviews.
3.10.5.4. Product Tear Downs and published books.
3.10.5.5. Independent Evaluations and public Financial Reports.
3.10.5.6. Comparing Machines, Production Lines, Plants, and Shifts
3.11. Measurement – Performance Metrics and Documentation
3.12. Project Charters and Paper Work: Specific, Measurable, Achievable, Relevant, and Time Bounded. DMAIC is what your customers expect to see. Frame your reports accordingly.
3.13. Textbook DMAIC breakthrough Case Study presentation. Project homework and reading assignments set.

4.0. Measure: Defining Process and System Capabilities
4.1. Review of Homework and Reading
4.2. Learning Objectives
4.3. The Complete Six Sigma Tool Kit: Vector Analysis Applied to a Data Matrix. Process Characterization and Optimization.
4.3.1. Flow Diagramming the Production Process
4.3.2. Populations versus Samples. Sampling our population of candy. Repetition for mastery using candy M&M Sampling: Collecting, Sorting, and Analyzing. Practice with JMP 5.0 Software Application.
4.3.3. Analysis: Mean, Standard Deviation, Probability, Graph
4.3.4. Histograms

4.4. Pareto Charts
4.5. Scatter Diagrams and Correlation Coefficients
4.6. Control Charts. Shewhart’s P-Chart.
4.7. Defects Per Unit. Calculating Defects per Million (DPU) Opportunities. Motorola’s classic proof reading example.
4.8. Cp and Cpk. Practical Applications using Dice. Confidence Interval introduction.
4.9. 2³ Designed Experiment: Comparing the value of systematic observation with simple arithmetic counts. Compare M&Ms Enumerative Sampling with two-level, eight factor DOE analytic sampling.
4.9.1. 2⁸ Helicopter Designed Experiment
4.9.2. Calculate and graph Cpk for select individual copters. Calculate and graph Cpk for all 16 copters.
4.10. The DMAIC Breakthrough Chart. Juran’s Trilogy. Six Sigma Values. Emphasis of key concept: Mean, Standard Deviation, Probability, and Graphed Results. JMP 5.0 or Minitab Calculation Practice.
4.11. Homework: Read Quality Function Deployment white papers for report. Project Selection Focused Homework on Process Capability.

5.0. Define: Negotiation, Quality Function Deployment, and Data Mining Training
5.1. Learning Objectives: Vector Analysis applied to a data matrix and Evidence-based decisions. Understanding the context of multiple variables is the key to breakthrough improvement projects.
5.2. Homework reports, presentations, and dialogue. Project selection updates including Crystal Ball model. Observe Designed Experiments DMAIC Demonstrations by Students.
5.2.1. Helicopter 2³ Confirmation Experiment

5.3. Change Agents and Team Leadership: Pythagoras, Aristotle, to Frederick Douglas and Harriett Tubman to 2004. How does this analogy apply to your work?
5.3.1. Diffusion of Innovation
5.3.1.1. Innovation Adoption Model
5.3.1.2. Adoption Process
5.3.2. Change Agent Methods
5.3.3. Understanding and overcoming Road blocks
5.3.3.1. Force Field Analysis – Forces Fighting Change
5.3.3.2. Cultural Influences
5.3.3.3. Communication
5.3.3.4. Motivation
5.3.4. Negotiation – Getting to YES.
5.4. KANO Model of Quality
5.5. Quality Function Deployment
5.5.1. The Four Houses of Quality. The Four Phases.
5.5.2. Building a House of Quality – One proven method of encouraging concurrent engineering.
5.5.2.1. The Whats
5.5.2.2. The Hows
5.5.2.3. “Correlation matrix” Trade Offs
5.5.2.4. Orthogonal Arrays
5.5.2.5. Using an Excel Template
5.5.3. Functional Requirements and Robust Design
5.5.4. Design for X (DFX): Design Constraints, design for manufacturability, design for test, design for maintainability.
5.6. Excel Data Sorting Function – A brief history of data mining.
5.6.1. Homogeneous Fields and Records. Columns and Rows.
5.6.2. 2³ DOE Data Mining Demonstration and practice. Iterations and efficient learning. (See the sketch following this class outline.)
5.6.3. A correct vector analysis: Thorough 6 Sigma Analysis: the average, standard deviation, probability, and analytic graph.
5.7. Homework: Outline project selections for class presentation. Bring data in a spreadsheet formatted for data (sorting) mining practice.
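The 2³ data mining demonstration in item 5.6.2 reduces to a few lines of arithmetic. The following sketch is ours, not the course’s: Python is assumed instead of the Excel sorting function, and the factor labels and response values are invented. It rank-orders the average effect of each factor, which is the “thorough 6 Sigma analysis” habit of item 5.6.3 in miniature.

    # A minimal sketch, assuming Python; all data are hypothetical.
    runs = [  # (A, B, C) coded levels followed by a measured response y
        (-1, -1, -1, 20.1), (+1, -1, -1, 24.3),
        (-1, +1, -1, 19.8), (+1, +1, -1, 24.9),
        (-1, -1, +1, 22.2), (+1, -1, +1, 26.0),
        (-1, +1, +1, 21.5), (+1, +1, +1, 26.4),
    ]

    def effect(col):
        """Average response at the high level minus average at the low level."""
        hi = [y for *x, y in runs if x[col] == +1]
        lo = [y for *x, y in runs if x[col] == -1]
        return sum(hi) / len(hi) - sum(lo) / len(lo)

    for name, col in zip("ABC", range(3)):
        print(name, round(effect(col), 2))

Sorted by absolute size, the printed effects are the rank order a profit signal analysis reports; with these invented numbers, factor A dominates.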

6.1. Learning Objectives
6.2. Homework presentations and review of data mining strategy.
6.3. Developing and Translating Customer Information
6.3.1. Customer needs, Drivers. Categorical Thinking.
6.3.2. Surveys: Telephone, mailing, interview
6.3.3. Identifying Critical to Quality Characteristics (CTQ)
6.3.3.1. Critical to Quality Tree
6.3.3.2. Quantified CTQ
6.3.4. Brainstorming
6.3.4.1. Affinity Diagram Experiment
6.3.4.2. Thought Process Mapping
6.3.4.3. Cause and Effect Diagrams
6.4. Measuring Value: Rolled Throughput Yield Metrics & Costs of Quality
6.4.1. Rolled Throughput Yield (RTY)
6.4.2. Uncovering the “Hidden Factory”
6.4.3. Poisson Computer Simulation on Defects per Unit
6.5. The 2⁴ Quincunx Machine Experiment for JMP or Minitab practice.
6.5.1. Observe the machine.
6.5.2. Central Limit Theorem Simulations using the machine and Decisioneering’s computerized demonstration model.
6.6. Costs of Poor Quality
6.6.1. Prevention Costs
6.6.2. Appraisal Costs
6.6.3. Internal Failures
6.6.4. External Failures
6.7. Detailed walk through of an exemplary Cost of Quality corporate report. Excel Spreadsheet template available.
6.8. Universal Standards of Measurement. Relevant to business in financial, quality, and productivity terms.
6.9. Drawing the Value Stream – Lean Flow Charting Fundamentals
6.10. Red Bead Sampling Game – Drive Down Costs. Interactive role playing using a game of historical significance.
6.11. DMAIC: Comprehensive definition process to determine variables and outcomes. Collecting data.

6.12. Quality Cost Statement by Product Line
6.13. Phillip Crosby’s Rule of 3
6.14. Taguchi Loss Function Example
6.15. Homework Focus on Quality Costs Project Results

7. Measure: Process Mapping
7.1. Homework Reports
7.2. Introduction
7.2.1. Workshop Purpose and Agenda
7.2.2. Learning Objectives
7.2.3. Why be concerned with information?
7.2.4. A Primary Objective
7.2.5. Systems Thinking
7.3. Process and System Concepts
7.3.1. Definitions
7.3.2. Systems and Processes
7.3.3. Process Categories
7.3.4. Process Boundaries
7.3.5. Process Customers
7.3.6. Process Model (SIPOC)
7.3.7. Why use a process model?
7.4. Define the Process
7.4.1. Outline for process definition
7.4.2. A Process Definition Tool
7.4.3. Exercise in process definition
7.4.4. What is this common process? What is your purpose?
7.4.5. What makes a process reliable?
7.5. Documenting Processes
7.5.1. Goals of Process Design
7.5.2. Why document a process?
7.5.3. Document design rules
7.5.4. A documentation survey tool
7.5.5. Balance document needs
7.5.6. Structure your information
7.6. What is process mapping?
7.6.1. Why use process maps (flow diagrams)?
7.6.2. The mapping method
7.6.3. Techniques for process mapping
7.6.4. Process model revisited
7.6.5. Flow charting the primary process
7.6.6. “Global” Process Requirements
7.6.7. What is a parallel process?

7.7. Flow charting alternative paths
7.7.1. Exercise: Primary Process
7.7.2. The decision question
7.7.3. Decision tree
7.7.4. Exercise: Alternative paths
7.8. Adopt and use standard symbols. Other useful symbols.
7.9. Characteristics of a good flow chart. Writing good narrative. Finish the flow chart.
7.10. Add control points. Controls: Some considerations. Exercise: Control Points.
7.11. Exercise: Define responsibilities. Responsibility matrix. Standardized Process Chart.
7.12. Types of maps
7.12.1. Simple flow chart
7.12.2. Top-down flow chart
7.12.3. Cross-functional flow chart
7.12.4. Data flow diagram
7.12.5. Geography flow diagram
7.12.6. PERT chart
7.12.7. Examples. Using alternate formats for process mapping.
7.13. Using maps to improve and streamline processes
7.13.1. Goals of process analysis. A process analysis tool.
7.13.2. Elimination targets: waste, rework, delays, reverse loops, and needless complexity.
7.13.3. Technique #1: Value assessment
7.13.4. Technique #2: Standardize
7.13.5. Technique #3: Using the map
7.13.6. Technique #4: Early control
7.13.7. Technique #5: Prevention
7.13.8. Technique #6: Analyze inputs
7.14. Key implementation points. Exercise: Remap.
7.15. Homework

8. Measure: The Productive Team Member
8.1. Workshop purpose and agenda. Introduction.

8.2. Learning objectives. Homework report (six sigma project progress).
8.2.1. Team Roles and Responsibilities; Meeting Management and Leader skills
8.2.2. Member role and responsibilities
8.2.3. Inputs for a Successful Team
8.2.4. Outputs of a Successful Team
8.2.5. What things the Team must Manage
8.2.6. Characteristics of Effective Teams
8.2.7. Teams vs. Groups
8.2.8. A Sample “Code of Cooperation”
8.2.9. Ground Rules for Consensus
8.2.10. Four Stages of Team Development. Stages of Team Development. Norms and Team Development.
8.2.11. The Four Room Apartment
8.2.12. Change vs. Transition
8.2.13. Internal Forces For Change
8.2.14. External Forces For Change
8.2.15. Strategies for Managing Change
8.2.16. Principles of Large-System Change
8.2.17. Sense Of Urgency—Good Or Bad?
8.2.18. Signs of Team Trouble
8.2.19. Groupthink
8.2.20. Overcoming Hindrances to Team Performance. Five Approaches To Getting Unstuck.
8.2.21. Improving Team Performance. Team Self-Evaluation.
8.2.22. Competition versus Cooperation
8.2.23. Circle In The Square exercise. Box Of Stuff exercise. Murder Mystery exercise.
8.2.24. Communication Model. Communication Breakdown. Improving Communication.
8.2.25. Barriers to Good Listening. How To Correct Bad Listening Habits.
8.2.26. Types of Feedback. Practicing Feedback. Building “I-Statements”.
8.2.27. The Johari Window
8.2.28. Learning Style Inventory
8.2.29. Close

9. Measuring the Process: Failure Mode Effects Analysis (FMEA) Workshop
9.1. Homework: Present project progress and estimated dollar savings using tools.
9.2. Definitions and Acronyms. History.
9.3. Walk through of the entire FMEA process, including group work tools and methods introduced in the effective team member class. Criticality is included and emphasized.
9.4. Review output of product. Review process capability concepts with working exercise.
9.5. Explain central limit theorem using coin tosses.
9.6. Use software to explore databases.
9.7. Assign Homework (Six Sigma Project Progress using appropriate tools). Visit http://www.fmeca.com/ and read as much as you can prior to the next class.

10. Analyze: Exploring, Summarizing, and Predicting using data. Collecting, Recording and Analyzing Measurement Data.
10.1. Homework Review – Focus on Project Financial Results
10.2. Learning Objectives
10.2.1. Be able to give examples of continuous, categorical, count, pass/fail, and life data.
10.2.2. Use correct graphics to summarize measurement data.
10.2.3. Fit a normal distribution to measurement data and assess goodness of fit.
10.3. Review: Types of Data
10.3.1. Traditional taxonomy: Nominal, Ordinal, interval, ratio
10.3.2. More useful modern taxonomy:
10.3.2.1. Attribute = categorical = discrete = nominal
10.3.2.2. Continuous = measurement = parameter = variable
10.3.2.3. Time to failure = life data
10.3.3. Real-world examples
10.4. Review of graphics for measurement data
10.4.1. Stem and leaf diagram

10.4.2. Boxplots
10.4.3. Frequency histogram and Cumulative Distribution Function (CDF)
10.5. Review of descriptive statistics for measurement data
10.5.1. Mean and standard deviation
10.5.2. Minimum, maximum and range
10.5.3. Plus and minus three standard deviations
10.6. Software practice
10.6.1. Entering data in rows and columns
10.6.2. Generating descriptive statistics and graphics
10.6.3. JMP Data Exploration Exercises
10.6.4. Producing a report: Integrating with Microsoft Word, Excel, and Power Point
10.7. Process capability
10.7.1. Yield calculations for one-sided specs
10.7.2. Yield calculations for two-sided specs
10.7.3. Cumulative or “rolled throughput” yield (Review and Reinforcement)
10.8. Workshop: Wooden sticks, calipers, data entry, coin tosses and Process Capability
10.9. Gauge Reproducibility and Repeatability Studies and Practice
10.10. Homework Focused on Project Results

11. Analyze: Inductive Reasoning Part 1 – Quantifying Uncertainty in Measurement Systems (Formerly known as Hypothesis Testing)
11.0. Homework Review Focused on Project Results
11.1. Learning Objectives
11.1.1. Explain relationships between processes and populations.
11.1.2. Express real-world problems in terms of statistical models and population parameters.
11.1.3. Identify default statistical models for measurement, pass/fail, count and life data.
11.1.4. Use Confidence Intervals to characterize or test a process in terms of mean, standard deviation, capability indices, fraction defective or reliability.
11.2. Central Limit Theorem, three sigma limits. Process Sampling. Population sampling.
11.3. Measurement Systems
11.3.1. Definition of a measurement system
11.3.2. Measurement objectives

11.4. Accuracy and precision (Review and Reinforcement)
11.4.1. Repeatability: dependability of the gauge
11.4.2. Reproducibility: dependability of gauge operators and environment
11.4.3. Repeatability and Reproducibility (R&R)
11.4.4. The role of calibration procedures
11.4.5. Examples and exercises: Calibration and Calibration Control: Penny for your Thoughts workshop exercise.
11.5. Measurement Uncertainty
11.6. Statistical Inference
11.6.1. The Normal, binomial, Poisson, Hyper-geometric, Chi Square and t distributions, and Weibull distribution models
11.6.2. Interval Estimation. Confidence and Evidence.
11.6.3. Sample Size Calculations
11.6.4. Pass/Fail data: Fair coin tosses. Interpreting Opinion Polls.
11.6.5. Characterizing and Testing exercises
11.7. The Seven Habits of Highly Statistical People: Quantifying Uncertainty, characterizing and testing processes, comparing processes, and relating variables.
11.8. Homework Focused on Project Results

12. Analyze: Inductive Reasoning Part II
12.0. Homework Review Focus on Project Results
12.1. Learning Objectives
12.1.1. Recognize statistical problems when they occur, and be able to classify them as testing an objective, comparing processes, or relating variables.
12.1.2. Identify appropriate null hypotheses for testing an objective, comparing processes, or relating variables.
12.1.3. Choose appropriate test procedures based on type of problem and type of data.
12.1.4. Use p-values to interpret the results of statistical tests.
12.1.5. Explain the difference between correlation and regression.
12.2. Statistical Hypotheses and Process Hypotheses
12.2.1. The null hypothesis. The “one-sided” fallacy.
12.3. The law of likelihood and likelihood function.
12.4. P-values

12.4.1. Mathematical definition
12.4.2. Operational interpretation
12.5. Degrees of Freedom. Likelihood ratio. ANOVA: the geometry of analysis.
12.5.1. Z statistics and the Z transformation. P values from Z statistics.
12.5.2. P values from the F statistic
12.5.3. P values from z or Chi squared distributions
12.6. Sample Size Calculations
12.6.1. Smallest difference of practical significance
12.6.2. Power of detection
12.7. Homework and Six Sigma Project Progress Review

13. Analyze: Quantifying the Strength of Evidence
13.1. Hypothesis testing revisited. Law of likelihood in comparison problems.
13.1.1. Example: comparing two opinion polls
13.1.2. Confidence interval for a difference
13.2. Pass-Fail Data
13.2.1. z test for equality of two Binomial proportions (valid only for large sample sizes)
13.2.2. Test for equality of two or more Binomial proportions (valid only for large sample sizes)
13.2.3. Likelihood ratio test for equality of two or more Binomial proportions
13.3. Number of Defects
13.3.1. z test for equality of two Poisson means (valid only for large sample sizes)
13.3.2. Test for equality of two or more Poisson means (valid only for large sample sizes)
13.3.3. Likelihood ratio test for equality of two or more Poisson means
13.4. Continuous Measurements
13.4.1. t test for equality of two Normal means
13.4.2. F test for equality of two Normal standard deviations
13.4.3. F test for equality of two or more Normal means (Analysis of Variance) (valid only if all standard deviations are the same)

13.5. Life Data (Time to Failure)
13.5.1. Likelihood ratio test for equality of two or more Weibull distributions
13.6. Chi-square tests
13.6.1. Tests of association in contingency tables
13.6.2. Interpreting the table of the Chi-square distribution
13.6.3. Workshop: Pennies for Your Thought
13.7. Relating variables: Model Building, Data Mining, and Linear Regression
13.7.1. Scatter Diagrams (Review and Reinforcement)
13.7.2. Correlation is not causation
13.8. Regression Analysis
13.8.1. Linear Regression Models
13.8.1.1. Straight-line regression
13.8.1.2. Multiple regression
13.8.1.3. Polynomial regression
13.8.2. Fitting Regression Models
13.8.2.1. The least squares estimates
13.8.2.2. The RMS error
13.8.3. Predicted mean values
13.8.3.1. Confidence intervals for predicted mean values
13.8.3.2. Confidence intervals for predicted individual values
13.8.4. Testing for significance of predictor variables
13.8.5. Testing for lack of fit
13.8.6. Regression diagnostics: Residual plots. The dangers of R². “All models are wrong, some are useful.”
13.9. JMP exercises. Workshop: Pennies for Your Thought.
13.10. Homework with Project Focus

14. Improve: Experimental Design and Analysis
14.0. Homework Review Focused on Projects
14.1. Learning Objectives
14.1.1. Be able to explain the difference between optimization and screening experiments.
14.1.2. Create matrices for optimization experiments.
14.1.3. Calculate sample sizes for optimization experiments.
14.1.4. Analyze data from optimization experiments.
14.1.5. Interpret and apply results from optimization experiments.
14.2. Introduction to Experimentation

14.2.1. Why should I do experiments?
14.2.2. When should I do experiments?
14.2.3. Do not experiment with one factor at a time! (OFAT Review and Reinforcement)
14.2.4. Describe iterative strategy for experimentation. Bold strategy.
14.3. Basic Design Process. Modified Design Process. Design principles.
14.4. Concepts and Definitions: DOE Terminology
14.4.1. Response
14.4.2. Factor
14.4.3. Types of factors: Continuous, Categorical
14.4.4. Level
14.4.5. Design Point
14.4.6. Experimental Unit
14.4.7. Design matrix
14.4.8. Factorial structure
14.4.9. Control group. Control.
14.4.10. Noise
14.4.11. Replication
14.4.12. Randomization
14.4.13. Blocking
14.4.14. Sample Size
14.5. Screening Experiments
14.6. Experiments with All Factors at Two Levels
14.6.1. Examples
14.6.2. JMP Steps
14.6.3. Exercises
14.7. Workshop: The Funnel Process

15. Improve: Process Optimization and Control
15.1. Learning Objectives
15.1.1. Create matrices for robust optimization experiments.
15.1.2. Calculate sample sizes for robust optimization experiments.
15.1.3. Analyze data from robust optimization experiments.
15.1.4. Perform multiple response analysis.

15.2. Review of Designed Experiments Homework
15.3. The Process of Experimentation
15.3.1. Types of experiments
15.3.2. Strategies for experimentation
15.3.3. The experimental cycle
15.3.4. Design process
15.4. Statistical Modeling
15.4.1. Standard assumptions
15.4.2. Models for continuous factors. Quadratic models for continuous factors.
15.4.3. Models for categorical factors
15.4.4. Continuous × categorical interactions
15.4.5. The method of least squares
15.4.6. Predicted values and residuals
15.5. Statistical Testing
15.5.1. Testing model coefficients
15.5.2. Testing for lack of fit
15.5.3. Sample size calculations
15.6. Multi-level Optimization Experiments
15.6.1. Quadratic models for continuous factors. Continuous × categorical interactions.
15.6.2. Response surface analysis
15.6.3. Example and JMP exercises
15.6.4. Exercises
15.7. Multiple response analysis; process improvements.
15.8. Homework: Design of Experiments Project Focus: Report project results in DMAIC format for final class.

16. Control: Optimization Experiments and Statistical Process Control
16.1. Learning Objectives
16.1.1. Understand common cause and special cause variation.
16.1.2. Establishing baselines. Rational sub-grouping. Monitoring low failure rates.
16.1.3. Describe a Reaction plan to out of control conditions.

16.2. Review of Designed Experiments Homework
16.3. Robust Optimization Experiments
16.3.1. The concept of robust optimization
16.3.1.1. Optimize the mean
16.3.1.2. Minimize the variance
16.3.1.3. Seeks best combination of close-to-target mean and low variability. Minimizes variability for a given mean.
16.3.2. Define noise factor. Identify key noise variables.
16.3.3. Include noise factor in the design
16.3.4. Strategy for design of robust optimization experiments
16.3.5. Strategy for analysis of robust optimization experiments
16.3.6. Thought process for designing an experiment. More on Sample size calculations.
16.4. Multiple Response Optimization
16.4.1. “Multiple responses” is the rule, not the exception
16.4.2. Optimizing one response at a time will not work
16.4.3. The three types of response objective
16.4.4. Desirability functions (see the sketch following this outline)
16.4.4.1. Constructing a desirability function for each response
16.4.4.2. Constructing the overall desirability
16.4.4.3. Maximizing the overall desirability
16.4.5. Apply multiple response technique. Maximize overall desirability.
16.4.6. Optimizing over subsets of the design region
16.4.7. Examples and JMP exercises
16.5. Statistical Process Control as a mind set and strategy
16.5.1. Short-Run SPC
16.5.2. Multivariate statistical process control
16.5.3. Acceptance sampling and broken promises.
16.5.4. Hands-on SPC experiments and software practice. Workshop: the Funnel Process. Exercises. Summaries for Quick Reference.
16.6. Homework: Design of Experiments Project Focused Report: project results in DMAIC format.
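Item 16.4.4’s desirability functions are easy to prototype. The sketch below is our illustration, not course material: the response names, targets, and limits are invented, and Python stands in for JMP’s desirability profiler. Each response is scored on a 0-to-1 scale, and the overall desirability is the geometric mean of the scores, so a zero on any single response vetoes the whole setting.

    # A minimal sketch, assuming Python; targets and limits are hypothetical.
    import math

    def d_target(y, low, target, high):
        """Target-is-best: desirability 1.0 at the target, 0.0 at the limits."""
        if y <= low or y >= high:
            return 0.0
        if y <= target:
            return (y - low) / (target - low)
        return (high - y) / (high - target)

    def d_maximize(y, low, high):
        """Larger-is-better: 0.0 at or below low, 1.0 at or above high."""
        return min(1.0, max(0.0, (y - low) / (high - low)))

    # Two hypothetical responses at one candidate factor setting.
    d1 = d_target(50.2, low=45.0, target=50.0, high=55.0)  # dimension on target
    d2 = d_maximize(92.0, low=80.0, high=95.0)             # yield, higher is better

    overall = math.sqrt(d1 * d2)   # geometric mean of the two desirabilities
    print(round(d1, 3), round(d2, 3), round(overall, 3))

Maximizing this overall score over candidate settings is the “maximize overall desirability” step of 16.4.4.3; a run that drives any one response outside its limits scores zero no matter how good the others look.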

IV. Profit Signals Production Notes

We consciously chose to demonstrate Senior Master Black Belt, Six Sigma level knowledge and skills in every aspect of the production of this book. We began by creating the Profit Signals title on June 18, 2003. We completed the work in PDF format on September 11, 2003. This is the classic Six Sigma project time line. The Internet, computing power, and software allowed us to complete the entire writing and production of the book in 90 days. Though it was an entirely Chance coincidence, the completion of our book on this day was an appropriate way to celebrate liberty, equality, freedom of speech, democratic values, applied science, art and the pursuit of happiness. Evidence-based decisions are as important to world peace as they are to prosperity.

Our lean production system included two authors. When necessary, we retained the illustration services of expert contractors. We produced the electronic versions of our book independently. We carry only the inventory we need for personal, corporate use. Kinko’s prints the four color cover, perfect bound paperback version on demand in a pull-production system.

The applications that played primary roles are as follows:

• Microsoft Word® 2000 and 2002 were our primary composition tools.
• Microsoft Excel® was used for spreadsheet screen captures and some graphics. Using Excel for data matrix vector analysis shows the amount of work required before a spreadsheet behaves like a reliable, rules-driven software analysis application.
• JMP 5.0® statistical software, manufactured by SAS, was our favored analytic program. We also use Minitab with clients who have that standard.

• Microsoft Explorer was the web browser we used for Internet research.
• Microsoft Power Point® was frequently used by Russell for first draft technical drawings.
• Adobe Acrobat® 6.0 Professional helped us disseminate copies for review.
• Adobe In Design® 2.0.2 allowed us to design, layout and construct our book.
• Adobe Photoshop® 7.0 was used for certain photographic and graphic illustrations.
• Adobe Illustrator® 10 transformed all illustrations into EPS files for production.
• Process Model®, a data matrix based flow diagramming program, was used to create flow diagrams.
• Crystal Ball by Decisioneering® was the Excel add-in we used to make this spreadsheet behave like a data matrix.
• Quality America’s Excel add-in, a Statistical Process Control program, was used to produce a control chart.
• Dell desktop and laptop computers, a Hewlett-Packard LaserJet 1300 and an hp officejet v40xi jet printer produced hard copy for old fashioned proof reading and review.

In our opinion, it is not only a reasonable expectation for Black Belts, Master Black Belts and Executive Champions to use a similar list of programs in their daily work, it is essential to Six Sigma powered project breakthroughs.

Profound thanks are due to our wives, Lynne and Michelle, and our wonderful children, Austin and Molly. Patience is their virtue. We love them.

In addition to the entire Adobe products technical support team, four individuals went well beyond the call of duty as we produced Profit Signals.

Jack Benham, the President of MedCath, Incorporated, Hospital Division, introduced us in July of 2002. Without Jack’s vision, leadership and masterful management skill there would be no Profit Signals. Our friend and colleague Bill Moore volunteered invaluable editorial support. The specificity of his constructive criticisms and the solutions he proposed strengthened the quality of our work immeasurably. Good on ya’, matey. Cheryl Payseno, nurse, friend, colleague, and former hospital administrator, volunteered her case study on Breaking the Time Barrier. She also encouraged us to tackle the cost accounting variance and break-even thinking head on with the Premise’s second illustration, Figure 2. Our friend, colleague, and final copy proof-reader Bethany Quillinan stepped into the fray to help us see our words through yet another set of eyes. She did a Six Sigma quality job on a pressure packed deadline.

So, “Thank you very, very, very much Lynne, Michelle, Austin, Molly, Jack, Cheryl, Bethany and Bill.” Onwards and upwards.

Index

A
Adams, John 217
Aladdin 179
analogy 44, 51
analysis 9, 37
Analysis of Variance 37, 222
ANOVA 37, 174
Archimedes 46
Aristotle 152, 175, 221

B
Bamboozle 56, 57
belt grinding 142
Bernstein, Peter L. 55
Black Belt 19, 53
Box, George E.P. 21
break-even analysis 11, 90

C
CABG 34, 92
Calder, Alexander 165
Case Studies 21
Cohen, Bernard 91
Confidence Level 81, 113
control chart 110, 115
cornerstone of evidence 10, 117

credulity 50
critical thinking 57
Critical to Quality 99
CTQ 99
cube 38
cynicism 118

D
Darwin, Charles 178
data matrix 9, 29, 96, 124, 167
data matrix geometry 124
da Vinci, Leonardo 43
defects per million 92
degrees of freedom 65, 168
Delusions 56
Design of Experiments 44, 46, 48
differences 32, 74
Disney, Walt 47, 53
Disraeli 118
DMAIC 21, 119, 177, 194

E
Einstein, Albert 43, 91
Einthoven, Willem 33
EKG 33
emergency department 128
Emerson, Ralph Waldo 222
Euclid 46
evidence-based decision 10, 18, 21, 22, 23
Executive Committee 209

F
Fads and Fallacies 85
Failure Mode Effects Analysis (FMEA) 112
feedback 136

Feigenbaum, Armand V. 110, 111
Feynman, Richard P. 43, 194
fields 59
fingerprints 178
Fisher, Ronald A. 223
Five-Minute PhD 20, 39, 152

G
G. Charter Harrison 179
GAAP 55
Galileo 223
Galton, Francis 55, 178
Galvin, Robert 87
Gantt, Henry L. 97
General Electric 110
generalization 9, 97
Generalization 9, 16
Gosset, William 32
Gould, Stephen Jay 177
Guinness 32

H
Harrison, Charter 179
Hidden Factory 218
hidden factory 108, 110
Hill, Sir Austin Bradford 135
Huff, Darrell 52
Hunter, J. Stuart 74
Hunter, William 74
hyperspace 38, 60

I
Imagineering 31

J
JCAHO 128
Jefferson, Thomas 217

JMP 131
Joint Commission 128

K
Kaizen-blitz 145
Keats, John 50
knowledge 36, 39

L
law of the universe 9, 66, 81, 158
lean 108
Length Of Stay 130

M
Mackay, Charles 56
main effect 39
Marconi 43
math phobia 220
Matreshka 106
Maxwell, James Clerk 91
measurements 9, 153
Michelangelo 43
Minitab 88
Motorola 223
multiplication 15

N
NASA 194
n dimensions 31
Netter, Frank 33
New Management Equation 63, 175
Newton, Isaac 50
Normal distribution 70

O
OFAT 103

P
Paine, Thomas 217
Paper Bags 14
Pareto chart 127, 140
perpendicular planes 41
PERT 97
Picasso, Pablo 37
predicted values 11, 171
process capability 113
process maps 94
Profit Signals 44
P-value 81
p-value 72, 89
Pythagoras 21
Pythagorean Theorem 13

Q
quarterly review 207

R
reasoning 81
records 123
refraction 50
regression modeling 177
Ronald Fisher 9, 12
Rothamsted 32
Russian dolls 106

S
Sagan, Carl 162
sample size 59
sample standard deviation 64
scientific management 13
Sculpey Clay 10, 164, 165
Shewhart, Walter A. 208, 209
Simulation 100
SIPOC 155
Sisyphus 180
Six Sigma 18, 19
Six Sigma theory 19

Six Sigma tools 19
Smith, Bill 18
spreadsheet 80
spreadsheet analysis 14
standards of evidence 20, 160
Stories 50
straight-line prediction 181
straw man 76
strength of evidence 83, 84

T
Taylor, Frederick W. 70, 103
tetrahedron 10, 13, 21, 49, 152, 165
Themis 83
Three Rs 23
Transparency 13
Turrell, James 47
Twain, Mark 59

V
variation 10
vector 10, 152, 158, 171, 208
vector analysis 9, 10, 13, 21, 60, 76, 84, 152