Teaching Notes
The precise manner in which a person or team approaches product design, problem solving for product excellence, or the development of product reliability is less critical than doing it in a systematic fashion. Students were exposed to process management and improvement in Chapter 7, but they may still have some difficulty understanding how measurement (metrology) and Six Sigma projects can be used at the design stage to make frequent but gradual changes as an approach to process improvement.
Key objectives for this chapter should include:
To introduce the concept of Design for Six Sigma (DFSS) consisting of a set of tools and
methodologies used in the product development process to ensure that goods and services
meet customer needs and achieve performance objectives, and that the processes used to
make and deliver them achieve Six Sigma capability. DFSS consists of four principal
activities of: Concept development, Design development, Design optimization, and
Design verification. These activities are often incorporated into a variation of the
DMAIC process, known as DMADV, which stands for Define, Measure, Analyze,
Design, and Verify.
Design for Quality and Product Excellence
To study social responsibilities in the design process, including product safety and environmental concerns, which have made Design for Environment (DfE) and design for disassembly important features of products because they permit easy removal of components for recycling or repair, eliminate other environmental hazards, and make repair more affordable.
To explore Design for Excellence (DFX), an emerging concept that includes many design-related initiatives such as concurrent engineering, design for manufacturability, design for assembly, design for environment, and other "design for" approaches. DFX objectives include higher functional performance, physical performance, user friendliness, reliability and durability, maintainability and serviceability, safety, compatibility and upgradeability, environmental friendliness, and psychological characteristics.
A scientific approach to tolerance design uses the Taguchi loss function. Taguchi assumes that losses can be approximated by a quadratic function, so that larger deviations from target correspond to increasingly larger losses. For the case in which a specific target value, T, is determined to produce the optimum performance, and in which quality deteriorates as the actual value moves away from the target on either side (called nominal is best), the loss function is represented by L(x) = k(x - T)^2.
To learn that design optimization includes setting proper tolerances to ensure maximum product performance and making designs robust; a scientific approach to tolerance design uses the Taguchi loss function. Design verification includes formal reliability evaluation, using techniques such as accelerated life testing and burn-in.
To understand that metrology, the science of measurement, assures the adequacy of measurements and is a vital part of global competitiveness, including characteristics such as accuracy, precision, repeatability (equipment variation), reproducibility (operator variation), calibration, and traceability.
To appreciate that process capability is the range over which the natural
variation of a process occurs as determined by the system of common causes; that is, what
the process can achieve under stable conditions. The relationship between the natural
variation and specifications is often quantified by a measure known as the process
capability index, Cp.
Reliability is generally defined as the probability that a product, piece of equipment, or system performs its intended function for a stated period of time under specified operating conditions. Shure tests its products thoroughly against this definition, tailoring tests to various market segments according to the type of use (or abuse) the equipment is likely to incur. For the consumer market, Shure uses the cartridge drop and scrape test, which is particularly important in light of how scratch DJs use the equipment. For presentation and installation audio systems, it uses the microphone drop test and perspiration test. For mobile communications, the two tests above, plus temperature and cable and cable assembly flex tests, are applicable. For performance audio, the microphone drop test, perspiration test, sequential shipping, cable and cable assembly flex, and temperature storage tests would all be appropriate. The purpose of the tests is to simulate actual operating conditions so that the products can sustain accidents and rough handling and perform effectively over a useful life. The quality characteristics studied are achieved reliability and performance.
2.
For the microphone drop test, the measures are probably variable measures of sound and
response levels, within an acceptable range. Thus, standard variables control charts may
be used. For the perspiration test, it may be that a p-chart or u-chart is used for attribute
measures. The cable and cable assembly flex test might use a p-chart to measure the
percentage of cables tested that failed due to rocking motions or twisting motions. The
sequential shipping tests would probably show varying proportions of failures due to
dropping, vibration, and rough handling. These might be sorted out using a Pareto chart.
Then efforts could be made to improve the most frequently occurring causes. The
cartridge drop and scrape test could also use p- or np-charts (see Chapter 13) to show
results per sample of 100 repetitions of the test. The temperature tests would most likely
use standard variables charts to measure whether test performance was within control
limits, or not.
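The attribute-chart arithmetic mentioned above is easy to sketch. The following minimal example computes p-chart control limits for something like the cable flex test; the failure counts and sample size are wholly invented for illustration:

```python
import math

# Minimal p-chart limit calculation for an attribute test such as the cable
# flex test. The failure counts and sample size below are invented.
failures = [4, 6, 3, 5, 7, 2, 5, 4]   # failed cables in each sample (assumed)
n = 100                                # cables tested per sample (assumed)

p_bar = sum(failures) / (len(failures) * n)    # average fraction failing
sigma_p = math.sqrt(p_bar * (1 - p_bar) / n)   # std. error of a proportion

ucl = p_bar + 3 * sigma_p
lcl = max(0.0, p_bar - 3 * sigma_p)            # a proportion cannot fall below 0
print(f"p-bar = {p_bar:.4f}, LCL = {lcl:.4f}, UCL = {ucl:.4f}")
```

With these numbers the lower limit is truncated to zero, which is common for p-charts when the average failure fraction is small.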
Applying QFD in a Managed Care Organization
1.
Although this example of QFD involved the design of a tangible item, QFD is more difficult to implement in a service context, as opposed to a pure manufacturing context, because both customer requirements and technical requirements are harder to quantify and assess than with tangible products.
2.
The detailed calculations in the "Importance of the hows" row and "Percentage of importance of the hows" row used to arrive at these figures can be shown and verified on a spreadsheet. Note that some discrepancies involving incorrect multiplication were found in part of the QFD House of Quality.
Summary of the House of Quality calculations (the direction-of-improvement row and the 9/3/1 relationship symbols of the original matrix are not reproduced here):

Customer requirement   Rate of Import.   Co. Now   Plan   Rate of Improv.   Absol. Wgt.   % Improve
Ease-use               4.5               3.2       4.5    1.4               6.3           25.2%
Accuracy               5.0               3.1       4.6    1.5               7.4           29.5%
Timeliness             3.2               3.8       3.8    1.0               3.2           12.7%
Clarity                3.8               2.6       3.9    1.5               5.7           22.7%
Conciseness            2.5               4.1       4.1    1.0               2.5           9.9%

Technical requirement (how)   Import. of hows   % of Import. of hows
Font size                     108.1             5.65%
Update                        427.9             22.35%
Photos                        153.4             8.01%
Use of colors                 98.2              5.13%
Gloss. Q&A Terms              460.0             24.03%
Sect.                         244.7             12.78%
Tbl. of Contnt.               249.1             13.01%
Lang. Friendly                173.0             9.04%

The importance of each how is the sum, over the customer requirements it relates to, of its 9/3/1 relationship weight multiplied by that requirement's percent improvement; for example, font size (weights 3, 1, and 1 against ease-use, clarity, and conciseness) gives 3(25.2) + 1(22.7) + 1(9.9), or approximately 108.1.
The numbers in the original table were verified by the calculations shown above (some columns of the original table were rearranged for convenience of calculation). The rates of improvement, absolute weights, and percent improvements, based on the given values for rate of importance, company now, and plan, were validated. As in the original table, the importance of the hows and percent of importance of the hows turned out to be accurately calculated. The specific factors shown as most important were glossary terms and updates.
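The spreadsheet verification can also be sketched programmatically. The following fragment reproduces the rate-of-improvement, absolute-weight, and percent-improvement arithmetic from the given rate-of-importance, company-now, and plan values:

```python
# Sketch of the spreadsheet verification, using the case data.
# Rate of improvement = plan / company now; absolute weight = rate of
# importance x rate of improvement; % improve = absolute weight / total.
requirements = {
    # name: (rate of importance, company now, plan)
    "Ease-use":    (4.5, 3.2, 4.5),
    "Accuracy":    (5.0, 3.1, 4.6),
    "Timeliness":  (3.2, 3.8, 3.8),
    "Clarity":     (3.8, 2.6, 3.9),
    "Conciseness": (2.5, 4.1, 4.1),
}

abs_weights = {}
for name, (importance, now, plan) in requirements.items():
    rate_improv = plan / now                 # e.g., Accuracy: 4.6 / 3.1 = 1.5
    abs_weights[name] = importance * rate_improv

total = sum(abs_weights.values())
pct_improve = {name: 100 * w / total for name, w in abs_weights.items()}

for name in requirements:
    print(f"{name:12s} wgt = {abs_weights[name]:.1f} ({pct_improve[name]:.1f}%)")
```

Running the fragment reproduces the tabled figures (for example, accuracy: absolute weight 7.4 and 29.5 percent), confirming the arithmetic.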
3.
The lessons that can be learned and applied to other service organizations seeking to design or redesign their products and services include the fact that QFD provides a systematic approach to linking the voice of the customer to operational requirements. By doing so, operating efficiencies can be realized and customer satisfaction can be enhanced. In addition, employee satisfaction can often be improved as well, as found in the case. It must be recognized that time and effort are involved in gathering, sorting, and analyzing the characteristics and factors. Also, there is subjectivity in applying ratings and weights to variables. Hence, the results are not easy to predict and guarantees are limited.
Preliminary Concept Development. In this phase, new ideas are studied for
feasibility.
Full-Scale Production. If no serious problems are found, the company releases the
product to manufacturing or service delivery teams.
2.
Competitive pressures are forcing companies to reduce time to market, which means that
the time for product development is also squeezed. The problems incurred in speeding up
the process are well known. If done too hastily, the result will be the need to revise or
scrap the design, cost increases or project over-runs, difficulty in manufacturing the
product, early product failure in the field, customer dissatisfaction, and/or lawsuits due to
product liability. One of the most significant impediments to rapid design is poor intra-organizational coordination. Reducing time to market can only be accomplished by
process simplification, eliminating design changes, and improving product
manufacturability. This requires involvement and cooperation of many functional groups to
identify and solve design problems in order to reduce product development and
introduction time.
3.
Design for Six Sigma (DFSS) uses a set of tools and methodologies in the product
development process to ensure that goods and services will meet customer needs and
achieve performance objectives, and that the processes used to make and deliver them
achieve Six Sigma capability. DFSS consists of four principal activities:
Concept development, in which product functionality is determined based upon
customer requirements, technological capabilities, and economic realities;
Design development, which focuses on product and process performance issues
necessary to fulfill the product and service requirements in manufacturing or delivery;
Design optimization, which seeks to minimize the impact of variation in production
and use, creating a robust design; and
Design verification, which ensures that the capability of the production system meets
the appropriate sigma level.
4.
QFD benefits companies through improved communication and teamwork between all
constituencies in the production process, such as between marketing and design, between
design and manufacturing, and between purchasing and suppliers. Product objectives are
better understood and interpreted during the production process. Use of QFD determines
the causes of customer dissatisfaction, making it a useful tool for competitive analysis of
product quality by top management. Productivity as well as quality improvements
generally follow QFD. QFD reduces the time for new product development. QFD allows
companies to simulate the effects of new design ideas and concepts. Companies can
reduce product development time and bring new products into the market sooner, thus
gaining competitive advantage.
6.
In the QFD development process, a set of matrices is used to relate the voice of the
customer to a product's technical requirements, component requirements, process control
plans, and manufacturing operations. The first matrix, called the House of Quality,
provides the basis for the QFD concept.
Building the House of Quality consists of six basic steps:
1. Identify customer requirements.
2. Identify technical requirements.
3. Relate the customer requirements to the technical requirements.
4. Conduct an evaluation of competing products or services.
5. Evaluate technical requirements and develop targets.
6. Determine which technical requirements to deploy in the remainder of the production/delivery process.
The first House of Quality in the QFD process provides marketing with an important tool
to understand customer needs and gives top management strategic direction. Three other
houses of quality are used to deploy the voice of the customer to (in a manufacturing
setting) component parts characteristics, process plans, and quality control. The second
house applies to subsystems and components. At this stage, target values representing the
best values for fit, function, and appearance are determined. In manufacturing, most of
the QFD activities represented by the first two houses of quality are performed by product
development and engineering functions.
In the last two stages, the planning activities involve supervisors and production line
operators. In the third house, the process plan relates the component characteristics to key
process operations, the transition from planning to execution. Key process operations are
the basis for a control point. A control point forms the basis for a quality control plan
delivering those critical characteristics that are crucial to achieving customer satisfaction.
This is specified in the last house of quality. These are the things that must be measured and evaluated on a continuous basis to ensure that processes continue to meet the important customer requirements defined in the first House of Quality.
7.
Product design can have a major impact on manufacturability. If careful thought and planning are not done by the designer (or design team), the end product can be difficult or impossible to build due to placement of components, impossible tolerances, difficulties in attaching or fastening components, and/or difficulties in getting the whole assembled system to work smoothly, even with the highest quality components. In addition, time, materials, and other resources may be wasted unnecessarily due to a poor manufacturing design.
The concept of Design for Manufacturability (DFM) is the process of designing a product so
that it can be produced efficiently at the highest level of quality. Its goal is to improve quality,
increase productivity, reduce lead time (time to market, as well as manufacturing time) and
maintain flexibility to adapt to future market conditions.
8.
Key design practices for high quality in manufacturing and assembly include: 1) analyze all
design requirements to assess proper dimensions and tolerances, 2) determine process
capability, 3) identify and evaluate possible manufacturing quality problems, 4) select
manufacturing processes that minimize technical risks, and 5) evaluate processes under actual
manufacturing conditions.
9.
Social responsibilities in the design process include safety and environmental concerns,
which have made Design for Environment (DFE) and Design for Disassembly important
features of products. Legal and environmental issues are becoming critical in designing
products and services, today. Product safety and its consequences, product liability, should be
of primary concern because of the damage that hazardous designs can do to consumers of the
product. Also, liability lawsuits can do major damage to the financial health of an
organization, as well as its image and reputation in the marketplace. Records and
documentation relating to the design process are the best defense against liability lawsuits.
These would include records on prototype development, testing, and inspection results.
Environmental issues involve questions of whether environmentally friendly designs (those that minimize damage to the environment in manufacture and product use) are being developed, what impacts the design of the product will have on the environment when it is scrapped, and how consumers can be given the most value for their money while balancing the other two issues. These questions can often be addressed through a design for environment approach (often combined with design for disassembly): what is the best design for repairability and recyclability?
10.
Design for Excellence (DFX) is an emerging concept that includes many design-related initiatives such as concurrent engineering, design for manufacturability, design for assembly, design for environment, and other "design for" approaches. DFX objectives include higher functional performance, physical performance, user friendliness, reliability and durability, maintainability and serviceability, safety, compatibility and upgradeability, environmental friendliness, and psychological characteristics. DFX represents a total approach to product development and design that involves the following activities:
Constantly thinking in terms of how one can design or manufacture products better,
not just solving or preventing problems
Focusing on things done right rather than things gone wrong
Defining customer expectations and going beyond them, not just barely meeting them
or just matching the competition
Optimizing desirable features or results, not just incorporating them
Minimizing the overall cost without compromising quality of function
11.
12.
The Taguchi loss function is a useful concept for process design. Taguchi suggests that there is no strict cut-off point that divides good quality from poor quality. Rather, he assumed that losses can be approximated by a quadratic function, so that larger deviations from target correspond to increasingly larger losses. For the case in which a specific target value, T, is determined to produce the optimum performance, and in which quality deteriorates as the actual value moves away from the target on either side (called nominal is best), the loss function is represented by L(x) = k(x - T)^2, where x is any actual value of the quality characteristic and k is some constant. Thus, (x - T) represents the deviation from the target, and the loss increases by the square of the deviation.
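A small numerical sketch makes the quadratic behavior concrete. All values below are invented for illustration; as is commonly done, k is fixed from one known cost point, such as a repair cost at a known deviation:

```python
# Nominal-is-best Taguchi loss, L(x) = k * (x - T)**2. All numbers here are
# invented for illustration; k is fixed from one known cost point, e.g., a
# $50 repair cost when the characteristic is 0.5 units off target.
T = 10.0                       # target value (assumed)
repair_cost, deviation = 50.0, 0.5
k = repair_cost / deviation ** 2          # 50 / 0.25 = 200

def loss(x):
    """Estimated loss for a unit whose characteristic measures x."""
    return k * (x - T) ** 2

# Loss grows with the square of the deviation: doubling the deviation
# quadruples the loss.
print(loss(10.25), loss(10.5), loss(11.0))   # 12.5 50.0 200.0
```

Note that the loss is zero only exactly at the target; any deviation, even one well inside the specification limits, carries some loss.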
13.
The purpose of Design Failure Mode and Effects Analysis (DFMEA) is to identify all the
ways in which a failure can occur, to estimate the effect and seriousness of the failure, and
to recommend corrective design actions. A DFMEA usually consists of specifying the
following information for each design element or function: Failure modes; effect of the
failure on the customer; severity, likelihood of occurrence, and detection rating; potential
causes of failure, and corrective actions or controls. A simple example of a DFMEA for an
ordinary household light socket is provided in the chapter.
14.
Reliability has grown increasingly important among the quality disciplines due to consumers' safety needs, companies' search for competitive advantage, growing consumer awareness and rising expectations, and the difficulty of achieving high reliability in more sophisticated and complex modern products.
15.
Reliability is the probability that a product, piece of equipment, or system performs its
intended function for a stated period of time under specified operating conditions. There are
four key components of this definition, including probability, time, performance, and
operating conditions. All of these have to be considered in a comprehensive definition of
reliability. Probability allows comparison of different products and systems, time allows us to measure the length of life of the product, performance relates to the ability of the product to do what it was designed to do, and operating conditions specify the amount of usage and the environment in which the product is used.
16.
A functional failure is one incurred at the start of the product's life due to defective materials,
components, or work on the product. A reliability failure is one that is incurred after some
period of use. For example, if a new TV set suffers a blown picture tube during the first
week, it's a functional failure. There was obviously a defect in the manufacture of the tube. If
the vertical hold feature of the set goes out (perhaps 3 days after the 1 year warranty is up),
that is a reliability failure. It should reasonably be expected to last much longer than one year,
but it didn't.
17.
Failure rate is defined as the number of failures per unit of time during a specified time
period being considered. For example, if 15 MP-3 players were tested for 500 hours and
there were two failures of the units, the failure rate would be: 2 / (15 x 500) = 1 / 3750 or
0.000267.
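The same arithmetic can be expressed in a short sketch, reusing the MP-3 player figures from the text:

```python
# The failure-rate arithmetic from the MP-3 player example: 2 failures over
# 15 units x 500 test hours. (Like the text's example, this simple form
# credits every unit with the full 500 hours.)
failures = 2
units, hours = 15, 500

failure_rate = failures / (units * hours)   # failures per unit operating hour
theta = 1 / failure_rate                    # mean time between failures (hours)

print(f"failure rate = {failure_rate:.6f} per hour, MTBF = {theta:.0f} hours")
```

The reciprocal of the failure rate, 3,750 hours here, is the mean time between failures.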
18.
The cumulative failure rate curve plots the cumulative percent of failures against time on the horizontal axis. The failure rate curve is obtained by determining the slope of the cumulative failure rate curve at a number of points to obtain the instantaneous failure rate (failures per unit time) at each point. A plot of these values yields the failure rate curve.
19.
The average failure rate over any interval of time is the slope of the line between the two endpoints of the interval on the cumulative failure rate curve.
20.
The product life characteristics curve is the so-called "bathtub curve" because of its shape. It is actually the failure rate curve described above. Such curves can be used to understand the distinctive failure rate patterns of various designs and products over time.
21.
The reliability function represents the probability that an item will not fail within a certain period of time, T. It is directly related to the cumulative distribution function F(T) = 1 - e^(-λT), which yields the probability of failure by time T. Since F(T) is the probability of failure, the reliability function, R(T), can be defined as its complement, i.e., the probability of not failing:
R(T) = 1 - (1 - e^(-λT)) = e^(-λT)
Because the failure rate λ is the reciprocal of the mean time to failure (MTTF), it can also be expressed as R(T) = e^(-T/MTTF).
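A brief sketch of these formulas, reusing the 1/3750 per hour failure rate from the earlier MP-3 player example:

```python
import math

# Exponential reliability: R(T) = e**(-lambda*T) = e**(-T/MTTF). The failure
# rate reused here is the 1/3750 per hour from the MP-3 player example.
lam = 1 / 3750        # failures per hour
mttf = 1 / lam        # mean time to failure, 3750 hours

def reliability(t):
    """Probability the unit is still working at time t."""
    return math.exp(-lam * t)

def failure_prob(t):
    """F(t) = 1 - e**(-lambda*t), the probability of failing by time t."""
    return 1 - reliability(t)

print(f"R(1000 h) = {reliability(1000):.3f}")   # about 0.766
```

A useful check on the exponential model: at T = MTTF the reliability is e^(-1), about 0.37, so only about a third of units are expected to survive past the mean life.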
22.
The reliability of series, parallel, and series-parallel systems is relatively easy to compute, given the reliability of the components in each system. For a series system of n components, R_S = R1 × R2 × … × Rn; reliabilities are multiplicative.
For a parallel system, the relationships are a little more complex, since the units are designed with redundant components, so that if one unit fails the system can continue to operate. The system reliability is computed as:
R_S = 1 - [(1 - R1)(1 - R2) … (1 - Rn)]
For series-parallel systems, the equivalent reliabilities of each parallel sub-system are
calculated, successively, until there are no more parallel sub-systems. The system is then
reduced to a serially equivalent system in which all component reliabilities can be multiplied
to get the final reliability value.
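These rules are straightforward to express in code. The component reliabilities below are invented for illustration:

```python
from functools import reduce

# Series, parallel, and series-parallel system reliability. Component
# reliabilities below are invented for illustration.

def series(*rs):
    """All components must work, so component reliabilities multiply."""
    return reduce(lambda acc, r: acc * r, rs, 1.0)

def parallel(*rs):
    """A redundant bank fails only if every component in it fails."""
    prob_all_fail = reduce(lambda acc, r: acc * (1 - r), rs, 1.0)
    return 1 - prob_all_fail

# Series-parallel: collapse each parallel bank into its equivalent
# reliability, then multiply the equivalents as a series system.
r_system = series(parallel(0.9, 0.9), 0.99, parallel(0.8, 0.8, 0.8))
print(round(r_system, 4))   # 0.99 * 0.99 * 0.992
```

Note how redundancy raises reliability: two 0.9 components in parallel are equivalent to a single 0.99 component, and three 0.8 components to a single 0.992 component.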
23.
The purpose of a design review is to stimulate discussion, raise questions, and generate
new ideas and solutions to help designers anticipate problems before they occur. To
facilitate product development, a design review is generally conducted in three major
stages of the product development process: preliminary, intermediate, and final. The
preliminary design review establishes early communication between marketing,
engineering, manufacturing, and purchasing personnel and provides better coordination of
their activities. It usually involves higher levels of management and concentrates on
strategic issues in design that relate to customer requirements and thus the ultimate quality
of the product. The preliminary design review evaluates such issues as the function of the
product, conformance to customer needs, completeness of specifications, manufacturing
costs, and liability issues.
After the design is well established, an intermediate review takes place to study the design
in greater detail to identify potential problems and suggest corrective action. Personnel at
lower levels of the organization are more heavily involved at this stage. Finally, just before
release to production, a final review is held. Materials lists, drawings, and other detailed
design information are studied with the purpose of preventing costly changes after
production setup.
24.
Methods of product testing for reliability include: life testing, accelerated life testing,
environmental testing and vibration and shock testing. In life and accelerated life testing the
product is tested until it fails. The latter speeds up the process by overstressing the item to
hasten its eventual failure. Environmental and shock tests are performed to determine the
product's ability to survive and operate under adverse conditions of heat, cold, or shock.
25.
Latent defects are frequently found in electronic devices, such as semiconductors. The term refers to the fact that a certain small proportion of units will have defects that show up during the early life of the product, perhaps the first 1,000 hours of operation. After this "infant mortality" period has passed, the remaining components may operate for years without many failures.
26.
Robust designs are those that are insensitive to variations in manufacturing or in the use
environment.
27.
Common types of measuring instruments (see Bonus Materials folder on the Premier
website) used in manufacturing today fall into two categories: low-technology and
high-technology. Low-technology instruments are primarily manual devices that have
been available for many years and include rulers, calipers, mechanical micrometers, go/no-go gauges, etc.; high-technology describes those that depend on modern electronics,
microprocessors, lasers, or advanced optics, such as micrometers with digital readouts,
electronic optical comparators, and computerized coordinate measuring machines.
28.
29.
30.
31.
Repeatability and reproducibility (R&R) require a study of variation and can be addressed through statistical analysis. R&R studies must be done systematically and require a number of steps. A repeatability and reproducibility study is conducted in the following manner (note: formulas are omitted for the sake of brevity).

1. Select m operators and n parts. Typically at least 2 operators and 10 parts are chosen. Number the parts so that the numbers are not visible to the operators.
2. Calibrate the measuring instrument.
3. Let each operator measure each part in a random order and record the results. Repeat this for a total of r trials. At least two trials must be used. Let M_ijk represent the kth measurement of operator i on part j.
4. Compute the average measurement for each operator and the difference between the largest and smallest average.
5. Compute the range for each part and each operator (these values show the variability of repeated measurements of the same part by the same operator); compute the average range for each operator; compute the overall average range.
6. Calculate control limits on the individual ranges R_ij, using a constant (D4) that depends on the sample size (number of trials, r) and can be found in a table for control charts. Any range value beyond the control limits might result from some assignable cause, not random error. Possible causes should be investigated and, if found, corrected. The operator should repeat these measurements using the same part. If no assignable cause is found, these values should be discarded and all statistics in step 5, as well as the control limit, should be recalculated.
7. Once these basic calculations are made, an analysis of repeatability and reproducibility can be performed: equipment variation (EV) is computed as a measure of repeatability, and operator variation, or appraiser variation (AV), as a measure of reproducibility.
8. Constants K1 and K2 are chosen, depending on the number of trials and the number of operators, respectively; these constants provide a 99 percent confidence interval on the statistics. An overall measure of repeatability and reproducibility (R&R) is then obtained by combining EV and AV.
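The R&R computations can be sketched as follows. The measurement data are invented, and the constants K1 = 4.56 (two trials) and K2 = 3.65 (two operators) are the values commonly given in R&R tables; other study sizes need other constants:

```python
import math

# Sketch of the R&R computations. The measurements are invented, and
# K1 = 4.56 (two trials) and K2 = 3.65 (two operators) are the usual
# tabled values. data[i][j] holds the r trial measurements by
# operator i on part j.
data = [
    [[20.1, 20.3], [19.8, 19.9], [20.5, 20.4]],   # operator 1 (assumed)
    [[20.2, 20.2], [19.9, 20.1], [20.6, 20.2]],   # operator 2 (assumed)
]
n_parts, r_trials = 3, 2
K1, K2 = 4.56, 3.65

# Average measurement per operator and the spread between operators.
op_means = [sum(sum(part) for part in op) / (n_parts * r_trials) for op in data]
x_diff = max(op_means) - min(op_means)

# Overall average range of repeated measurements of the same part.
ranges = [max(part) - min(part) for op in data for part in op]
r_bar = sum(ranges) / len(ranges)

ev = K1 * r_bar                                  # equipment variation (repeatability)
av_sq = (K2 * x_diff) ** 2 - ev ** 2 / (n_parts * r_trials)
av = math.sqrt(max(av_sq, 0.0))                  # appraiser variation (reproducibility)
rr = math.sqrt(ev ** 2 + av ** 2)                # overall R&R
print(f"EV = {ev:.3f}, AV = {av:.3f}, R&R = {rr:.3f}")
```

With these invented data the appraiser term comes out negative and is set to zero, a common occurrence that simply means operator variation is negligible relative to equipment variation.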
Process capability is the range over which the natural variation of a process occurs as
determined by the system of common causes. It is the ability of the combination of people,
machines, methods, materials, and measurements to produce a product or service that will
consistently meet design specifications. Process capability is measured by the proportion
of output that can be produced within design specifications; in other words, it is a
measurement of the uniformity of the product.
33.
34.
The following are brief definitions of the various process capability indexes:
Cp is the ratio of the specification width to the natural tolerance of the process.
Cpl is the lower one-sided index that relates the distance from the process mean to the lower tolerance limit to its 3σ natural spread.
Cpu is the upper one-sided index that relates the distance from the process mean to the upper tolerance limit to its 3σ natural spread.
These indexes are calculated to determine the ability of a process to meet or exceed design specifications and are only meaningful when a process is known to be under control. Generally, a process is considered to be capable if its index is 1.0 or above. These indexes may be used to establish quality policy in operating areas or with a supplier by stating an acceptable standard, such as: all capability indexes must be at 2.0 (called 6σ quality) or above if the process is to be considered acceptable for elimination of inspection processes by customers.
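A short sketch of the index calculations, with assumed process statistics and specification limits (Cpk, the common combined index, is included for completeness):

```python
# The capability indexes, computed for an assumed process: mean 10.05,
# standard deviation 0.1, specifications 9.6 to 10.4 (all values invented).
usl, lsl = 10.4, 9.6       # upper and lower specification limits
mean, sigma = 10.05, 0.1   # process mean and standard deviation

cp  = (usl - lsl) / (6 * sigma)     # spec width vs. 6-sigma natural spread
cpu = (usl - mean) / (3 * sigma)    # upper one-sided index
cpl = (mean - lsl) / (3 * sigma)    # lower one-sided index
cpk = min(cpu, cpl)                 # combined index: the worse of the two sides

print(f"Cp = {cp:.2f}, Cpu = {cpu:.2f}, Cpl = {cpl:.2f}, Cpk = {cpk:.2f}")
```

Note that because the assumed process is off-center (mean 10.05 rather than 10.0), Cpk is smaller than Cp; the one-sided indexes reveal which specification limit the process is crowding.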
SOLUTIONS TO PROBLEMS
Note: Data sets for several problems in this chapter are available in the Excel workbook C12Data on the Premium website accompanying this text. Click on the appropriate worksheet tab as noted in the problem (e.g., Prob. 12-5) to access the data.
1.
Tonia's Tasty Tacos conducted consumer surveys and focus groups and identified the most
important customer expectations as
Develop a set of technical requirements to incorporate into the design of a new facility and
a House of Quality relationship matrix to assess how well your requirements address these
expectations. Refine your design as necessary, based upon the initial assessment.
Answer
1.
Analysis of customer responses for Tonia's Tasty Tacos indicates that there are likely to be several strong relationships between customer requirements and associated technical requirements of the product that Tonia designs (for example, a burrito), such as value vs. price and nutrition vs. calories (and other nutritional content values, such as sodium and percent trans-fat).
Note the three customer response categories that are unrelated to the design of the burritos -- order accuracy, speedy service, and menu board.
Partial House of Quality for Problem 1: customer requirements (Flavor, Visually Appealing, Nutritious, Good Value) rated on Importance (1-5), Competitive Evaluation (1-5), and Selling Points (1-5), against the technical requirements Size, Moistness, Calories, Sodium, and % trans-fat.
Newfonia, Inc., is working on a design for a new smartphone. Marketing staff conducted
extensive surveys and focus groups with potential customers to determine the
characteristics that the customers want and expect in a smartphone. Newfonia's studies
have identified the most important customer expectations as
Initial cost
Reliability
Ease of use
Features
Operating cost
Compactness
Answer
2.
3.
Tonia's Tasty Tacos (Problem 1) acquired some additional information. It found that consumers placed the highest importance on healthy food, followed by value, followed by order accuracy and service. The menu board was only casually noted as an important attribute in the surveys. Tonia faces three major competitors in this market: Grabby's, Tacoking, and Sandy's. Studies of their products yielded the information shown in the table in the C12Data file for Prob. 12-3 on the Premium website for this chapter. Results of the consumer panel ratings for each of these competitors can also be found there (a 1-5 scale, with 5 being the best). Using this information, modify and extend your House of Quality from Problem 1 and develop a deployment plan for a new burrito. On what attributes should the company focus its marketing efforts?
Answer
3.
With the new data given for Tonia's customers, a partial House of Quality for the design of
the burritos can be built, as shown below. Note that the relationships between customer
requirements (flavor, health, value) and associated technical requirements (% fat, calories,
sodium, price) of the burrito design are strong.
The inter-relationships of the roof are not shown (limitations of MS Word software), but these may be sketched in. For example, they would show a strong inter-relationship between fat and calories.
PARTIAL HOUSE OF QUALITY MATRIX FOR TONIA'S TASTY TACOS

Customer requirements (Flavor, Visually Appealing, Nutritious, Good Value) are rated on Importance (1-5), Competitive Evaluation (1-5), and Selling Points (1-5), against the technical requirements Price, Taste, Moistness, Size, Calories, Sodium, and % trans-fat; competitor ratings appear as letter codes in the original matrix. The targets and deployment row sets a price of $0.26/oz., a 7.0 oz. size, 80 calories/oz., 85 mg sodium, and 13% fat.

Competitive evaluation data:

Company     Price/oz.   Calories/oz.   % Fat
Grabby's    $0.282      80             13.63
Tacoking    $0.300      85             12.67
Sandy's     $0.292      90             13.33
If Tonia's can design a flavorful, healthy, 7 oz. taco and sell it at an attractive price (say,
$1.85 or less), it should be a very profitable undertaking.
4.
Newfonia, Inc. (Problem 2), faces three major competitors in this market: Oldphonia,
Simphonia, and Colliefonia. It found that potential consumers placed the highest
importance on reliability (measured by such things as freedom from operating system
crashes and battery life), followed by compactness (weight/bulkiness), followed by
flexibility (features, ease of use, and types of program modules available). The operating
cost was only occasionally noted as an important attribute in the surveys. Studies of their
products yielded the information shown in the table in C12Data file for Prob.12-4 on the
Premium website for this chapter. Results of the consumer panel ratings for these
competitors are also shown in that spreadsheet. Using this information, modify and extend
your House of Quality from Problem 2 and develop a deployment plan for the new
smartphone. On what attributes should the company focus its marketing efforts?
Answer
4.
With the new data given for Newfonia's potential customers, a partial House of Quality
for the design of the smartphone can be built, as shown below. Note the strong
relationships between customer requirements and associated technical requirements of the
smartphone design.
The inter-relationships of the roof are not shown (limitations of MSWord
software), but these may be sketched in. For example, they would show
a strong inter-relationship between size and weight.
[Partial House of Quality matrix for the smartphone design. Customer requirements: Reliable (keeps operating), Compact (fits in a pocket, not heavy), Features (calendar, contact management, etc.), Ease of use (intuitive operations), Value (good value). Technical requirements: Weight (oz.), Features (number), Operating program, Battery life, Operating cost. The matrix also carries Importance (1-5), Competitive Evaluation (1-5), and Selling Points (1-5) sections, plus Targets and Deployment rows; targets include a $250 price, a 6 oz. weight, 10 features, and a Win. CE operating program.]

Competitive Evaluation: Oldphonia, Simfonia, Colliefonia

[Symbol legend: very strong relationship; strong relationship; weak relationship.]
A genetic researcher at GenLab, Ltd. is trying to test two laboratory thermometers (that
can be read to 1/100,000th of a degree Celsius) for accuracy and precision. She measured
25 samples with each and obtained the results found in the C12Data file for Prob.12-5 on
the Premium website for this chapter. The true temperature being measured is 0 degrees
C. Which instrument is more accurate? Which is more precise? Which is the better
instrument?
Answer
5.
Accuracy is measured by the relative error of the mean reading from the true value of 0 degrees C:

   100 x |mean - 0| / 1 deg.

Summary statistics for the 25 readings from each instrument (see the spreadsheet for details):

Thermometer 1: mean = -0.00005, s = 0.001343, minimum -0.002514, maximum 0.002456, range 0.004970
(frequencies 1, 1, 3, 5, 5, 6, 4 across cells at -0.00251, -0.00169, -0.00086, -0.00003, 0.00080, 0.00163, 0.00246)

Thermometer 2: mean = -0.000046, s = 0.001204, minimum -0.002209, maximum 0.002316, range 0.004525
(frequencies 1, 7, 7, 7, 3 across cells at -0.00221, -0.00070, 0.00005, 0.00156, 0.00232)

The second thermometer's mean is closer to the true value, so it is slightly more accurate, and its standard deviation is smaller, so it is also more precise. The second thermometer is the better instrument.
6.
Two scales at Aussieburgers, Ltd. were used to weigh the same 25 samples of hamburger
patties for a fast-food restaurant in Australia. Results are shown in the C12Data file for
Prob.12-6 on the Premium website for this chapter. The samples were weighed in grams,
and the supplier has ensured that each patty weighs 114 grams. Which scale is more
accurate? Which is more precise? Which is the better scale?
Answer
6.
Scale A: accuracy = 100 x |113.96 - 114| / 114 = 0.035%
Scale B: accuracy = 100 x |115.92 - 114| / 114 = 1.685%

Scale A (Frequency Table, Problem 12-6a):
Upper cell boundaries: 112.00, 112.67, 113.33, 114.00, 114.67, 115.33, 116.00
Frequencies:           3,      0,      5,      9,      0,      6,      2
Mean 113.96, median 114.00, mode 114.00, std. dev. 1.14, variance 1.29,
maximum 116.00, minimum 112.00, range 4.00

Scale B (Frequency Table, Problem 12-6b):
Upper cell boundaries: 114.00, 115.33, 116.00, 117.33, 118.00
Frequencies:           3,      5,      10,     5,      2
Mean 115.92, median 116.00, mode 116.00, std. dev. 1.12, variance 1.24,
maximum 118, minimum 114, range 4.00

Scale A is far more accurate (0.035% versus 1.685% error), while Scale B is slightly more precise (s = 1.12 versus 1.14). Because the accuracy difference dominates, Scale A is the better scale.
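The accuracy and precision comparisons in Problems 5 and 6 follow the same recipe: accuracy is the percent error of the mean from the true value, and precision is the sample standard deviation. A short Python sketch (the function names and sample readings are illustrative; the actual data are in the C12Data files):

```python
from statistics import mean, stdev

def accuracy_pct(readings, true_value):
    """Percent error of the mean reading relative to the true value."""
    return 100 * abs(mean(readings) - true_value) / true_value

def precision(readings):
    """Sample standard deviation; smaller means more precise."""
    return stdev(readings)

# Hypothetical readings engineered to average 113.96 g, like Scale A
scale_a = [113.0, 114.0, 115.0, 114.0, 113.8]
error_a = accuracy_pct(scale_a, 114)   # about 0.035%
spread_a = precision(scale_a)
```

The instrument with the smaller percent error is the more accurate; the one with the smaller standard deviation is the more precise.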
7.
The Taguchi loss function for the PlataLimpia, Inc. part is L(x) = k(x - T)^2:
$15 = k(0.025)^2
k = 24,000
L(x) = k(x - T)^2 = 24,000(x - T)^2
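The constant k generalizes to the other Taguchi problems in this section: it is the scrap or repair cost divided by the square of the deviation at which that cost is incurred. A minimal Python sketch (function names are illustrative):

```python
def taguchi_k(cost, deviation):
    """Solve cost = k * deviation**2 for k."""
    return cost / deviation ** 2

def taguchi_loss(k, deviation):
    """Taguchi loss L(x) = k * (x - T)**2, expressed in the deviation x - T."""
    return k * deviation ** 2

# Problem 7: a $15 scrap cost at a deviation of 0.025 cm gives k = 24,000
k = taguchi_k(15, 0.025)
```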
8.
A team was formed to study the dishwasher part at PlataLimpia, Inc. described in Problem
7. While continuing to work to find the root cause of scrap, they found a way to reduce
the scrap cost to $10 per part.
a. Determine the Taguchi loss function for this situation.
b. If the process deviation from target can be held at 0.015 cm, what is the Taguchi loss?
Answer
8. a) $10 = k(0.025)^2, so k = 16,000 and L(x) = 16,000(x - T)^2
   b) L(0.015) = 16,000(0.015)^2 = $3.60
9.
A specification for the length of an auto part at PartsDimensions, Inc. is 5.0 ± 0.10
centimeters (cm). It costs $50 to scrap a part that is outside the specifications. Determine
the Taguchi loss function for this situation.
Answer
9. $50 = k(0.10)^2, so k = 5,000 and L(x) = 5,000(x - T)^2
A team was formed to study the auto part at PartsDimensions described in Problem 9.
While continuing to work to find the root cause of scrap, the team found a way to reduce
the scrap cost to $30 per part.
a. Determine the Taguchi loss function for this situation.
b. If the process deviation from target can be held at 0.020 cm, what is the Taguchi loss?
Answer
10. a) $30 = k(0.10)^2, so k = 3,000 and L(x) = 3,000(x - T)^2
    b) L(0.020) = 3,000(0.020)^2 = $1.20
11.
Ruido Unlimited makes electronic soundboards for car stereos. Output voltage to a certain
component on the board must be 12 ± 0.2 volts. Exceeding the limits results in an
estimated loss of $50. Determine the Taguchi loss function.
Answer
11. $50 = k(0.2)^2, so k = 1,250 and L(x) = 1,250(x - 12)^2
12.
b) EL(x) = k(sigma^2 + D^2) = 9(1^2 + 0^2) = $9

13.
An automatic cookie machine must deposit a specified amount of 25 ± 0.2 grams (g) of
dough for each cookie on a conveyor belt. If the machine either over- or underdeposits the
mixture, it costs $0.02 to scrap the defective cookie.
a. What is the value of k in the Taguchi loss function?
b. If the process is centered on the target specification with a standard deviation of 0.06 g,
what is the expected loss per unit?
Answer
13. a) $0.02 = k(0.2)^2, so k = 0.50
    b) With sigma = 0.06 and the process centered (D = 0):
       EL(x) = k(sigma^2 + D^2) = 0.50(0.06^2 + 0^2) = $0.0018 per cookie

14.
A computer chip is designed so that the distance between two adjacent pins has a
specification of 2.000 ± 0.002 millimeters (mm). The loss due to a defective chip is $2. A
sample of 25 chips was drawn from the production process and the results, in mm, can be
found in the C12Data file for Prob.12-14 on the Premium website for this chapter.
a. Compute the value of k in the Taguchi loss function.
b. What is the expected loss from this process based on the sample data?
Answer
14. a) L(x) = k(x - T)^2; $2 = k(0.002)^2, so k = 500,000
    b) From the sample data, s = 0.00104, so
       EL(x) = k(sigma^2 + D^2) = $0.544

15.
In the production of transformers, any output voltage that exceeds 120 ± 15 volts is
unacceptable to the customer. Exceeding these limits results in an estimated loss of $450.
However, the manufacturer can adjust the voltage in the plant by changing a resistor that
costs $2.25.
a. Determine the Taguchi loss function.
b. Suppose the nominal specification is 120 volts. At what tolerance should the
transformer be manufactured, assuming that the amount of loss is represented by the cost
of the resistor?
Answer
15. a) $450 = k(15)^2, so k = 2.0 and L(x) = 2.0(x - T)^2
    b) Setting the loss equal to the cost of the resistor:
       2.25 = 2.0(x - 120)^2
       (x - 120)^2 = 1.125, so the tolerance is sqrt(1.125) = 1.06 volts,
       and the transformer should be manufactured at 120 ± 1.06 volts.
16.
In the production of integrated circuits, any output voltage that exceeds 120 ± 5 volts is
unacceptable to the customer. Exceeding these limits results in an estimated loss of $200.
However, the manufacturer can still adjust the voltage in the plant by changing a resistor
that costs $2.00.
a. Determine the Taguchi loss function.
b. Suppose the nominal specification remains at 120 volts. At what tolerance should the
integrated circuit be manufactured, assuming that the amount of loss is represented by the
cost of the resistor?
Answer
16. a) The Taguchi loss function is L(x) = k(x - T)^2:
       $200 = k(5)^2, so k = 8 and L(x) = 8(x - T)^2
    b) $2.00 = 8(x - 120)^2
       (x - 120)^2 = 0.25, so the tolerance is sqrt(0.25) = 0.5 volts
       x = 120 ± 0.5 volts
17.
Two processes, P and Q, are used by a supplier to produce the same component, Z, which
is a critical part in the engine of the Air2Port 778 airplane. The specification for Z calls for
a dimension of 0.24 mm 0.03. The probabilities of achieving the dimensions for each
process based on their inherent variability are shown in the table found in the C12Data file
for Prob.12-17 on the Premium website for this chapter. If k = 60,000, what is the
expected loss for each process? Which would be the best process to use, based on
minimizing the expected loss?
Answer
17.
For the Air2Port 778 plane parts (see spreadsheets Prob12-17.xls for detailed calculations):
Dimension (mm):  0.20   0.21   0.22   0.23   0.24   0.25   0.26   0.27   0.28
Loss ($):       96.00  54.00  24.00   6.00   0.00   6.00  24.00  54.00  96.00

Process P
Probability:     0     0.12   0.12   0.12   0.28   0.12   0.12   0.12   0
Weighted loss:  0.00   6.48   2.88   0.72   0.00   0.72   2.88   6.48   0.00
Expected loss = $20.16

Process Q
Probability:    0.02   0.03   0.15   0.15   0.30   0.15   0.15   0.03   0.02
Weighted loss:  1.92   1.62   3.60   0.90   0.00   0.90   3.60   1.62   1.92
Expected loss = $16.08
Therefore, Process Q incurs a smaller loss than Process P, even though some output of Q
falls outside specifications.
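The weighted-loss arithmetic above can be reproduced by weighting the Taguchi loss at each dimension by its probability. A sketch using the Problem 17 data (the dictionary layout is illustrative):

```python
def taguchi_loss(k, x, target):
    return k * (x - target) ** 2

def expected_loss(k, target, distribution):
    """Sum of the loss at each dimension weighted by its probability."""
    return sum(p * taguchi_loss(k, x, target) for x, p in distribution.items())

K, TARGET = 60000, 0.24
process_p = {0.21: 0.12, 0.22: 0.12, 0.23: 0.12, 0.24: 0.28,
             0.25: 0.12, 0.26: 0.12, 0.27: 0.12}
process_q = {0.20: 0.02, 0.21: 0.03, 0.22: 0.15, 0.23: 0.15, 0.24: 0.30,
             0.25: 0.15, 0.26: 0.15, 0.27: 0.03, 0.28: 0.02}
# Process P yields an expected loss of $20.16; Process Q yields $16.08
```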
18.
The average time to handle a call in the Call-Nowait call processing center has a
specification of 6 ± 1.25 minutes. The loss due to a mishandled call is $16. A sample of 25
calls was drawn from the process and the results, in minutes, can be found in the C12Data
file for Prob.12-18 on the Premium website for this chapter.
a. Compute the value of k in the Taguchi loss function.
b. What is the expected loss from this process based on the sample data?
Answer
18. a) $16 = k(1.25)^2, so k = 10.24
    b) E[L(x)] = k(sigma^2 + D^2) = $8.218, based on the sample data
19.
Compute the average failure rate during the intervals 0 to 40, 40 to 70, and 70 to 100, and
0 to 100, based on the information in Figure 12.28.
Answer
19.
The life of a cell phone battery is normally distributed with a mean of 900 days and
standard deviation of 50 days.
a. What fraction of batteries is expected to survive beyond 975 days?
b. What fraction will survive fewer than 800 days?
c. Sketch the reliability function.
d. What length of warranty is needed so that no more than 10 percent of the batteries will
be expected to fail during the warranty period?
Answer
20. a) P(x > 975): z = (975 - 900)/50 = 1.5
       Therefore, P(x > 975) = 0.5 - 0.4332 = 0.0668, so 6.68% should survive
       beyond 975 days.
    b) P(x < 800): z = (800 - 900)/50 = -2.0
       Therefore, P(x < 800) = 0.5 - 0.4772 = 0.0228, so 2.28% should survive
       fewer than 800 days.
    c) The reliability function looks approximately as follows (see
       spreadsheet Prob12-20.xls for details).
    d) P(x < xw) = 0.10 requires z = -1.28; from (xw - 900)/50 = -1.28,
       xw = 836 days for the warranty limit.

21.
Lifetred, Inc., makes automobile tires that have a mean life of 75,000 miles with a
standard deviation of 2,500 miles.
a. What fraction of tires is expected to survive beyond 77,250 miles?
b. What fraction will survive fewer than 68,750 miles?
c. Sketch the reliability function.
d. What length of warranty is needed so that no more than 10 percent of the tires
will be expected to fail during the warranty period?
Answer
21. a) P(75,000 < x < 77,250) = P(0 < z < (77,250 - 75,000)/2,500) = P(0 < z < 0.9) = 0.3159
       Therefore, P(x > 77,250) = 0.5 - 0.3159 = 0.1841, so 18.41% should
       survive beyond 77,250 miles.
    b) P(x < 68,750) = P(z < (68,750 - 75,000)/2,500) = P(z < -2.50)
       = 0.5 - P(68,750 < x < 75,000) = 0.5 - 0.4938 = 0.0062
       Therefore, 0.62% should survive fewer than 68,750 miles.
    c) The reliability function is similar in shape to that of Problem 20 (see
       the spreadsheet for details).
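The normal-distribution survival calculations in Problems 20 and 21 can be checked with Python's standard library (3.8+), which avoids interpolating in a z-table:

```python
from statistics import NormalDist

# Problem 20: battery life ~ N(900, 50)
battery = NormalDist(mu=900, sigma=50)

surviving_975 = 1 - battery.cdf(975)   # fraction lasting beyond 975 days, about 6.7%
failing_800 = battery.cdf(800)         # fraction failing before 800 days, about 2.3%
warranty = battery.inv_cdf(0.10)       # 10th percentile, about 836 days
```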
Massive Corporation tested five motors in an 800-hour test. Compute the failure rate if
three failed after 200, 375, and 450 hours and the other two ran for the full 800 hours
each.
Answer
22. Failure rate = 3 / [(2 x 800) + 200 + 375 + 450] = 3/2,625
    = 0.001143 failures/hour
Livelong, Inc.'s computer monitors have a failure rate of 0.00005 units per hour. Assuming
an exponential distribution, what is the probability of failure within 10,000 hours? What is
the reliability function?
Answer
23. P(failure within 10,000 hours) = 1 - e^(-0.00005 x 10,000) = 1 - e^(-0.5) = 0.3935
    The reliability function is R(T) = e^(-0.00005T).
24.
The failure rate of a component is given as 0.000015. Find the mean time to failure (MTTF). What is the probability
that the component will not have failed after 12,000 hours of operation?
Answer
24. The MTTF is 1/0.000015 = 66,666.67 hours.
    R(12,000) = e^(-0.000015 x 12,000) = e^(-0.18) = 0.835
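The exponential-reliability results in Problems 23 and 24 follow from R(T) = e^(-lambda T) and MTTF = 1/lambda. A quick check in Python:

```python
import math

def reliability(rate, hours):
    """R(T) = exp(-rate * T) for an exponentially distributed life."""
    return math.exp(-rate * hours)

def mttf(rate):
    """Mean time to failure for a constant failure rate."""
    return 1 / rate

p_fail = 1 - reliability(0.00005, 10000)   # Problem 23: about 0.3935
survives = reliability(0.000015, 12000)    # Problem 24: e**-0.18, about 0.835
```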
The MTBF of an integrated circuit made by IceeU, Inc. is 18,000 hours. Calculate the
failure rate.
Answer
25. Failure rate = 1/MTBF = 1/18,000 = 0.0000556 failures per hour

26.
The components of a radio can be purchased from three different suppliers. The reliabilities of
the components are as follows:

Component   Supplier 1   Supplier 2   Supplier 3
A           .97          .92          .95
B           .85          .90          .90
C           .95          .93          .88
Transportation and purchasing considerations require that only one supplier be chosen.
Which one should be selected if the radio is to have the highest possible reliability?
Answer
26. For each supplier, the radio's reliability is the product of the three
    component reliabilities in series:
    Supplier 1: (.97)(.85)(.95) = 0.783
    Supplier 2: (.92)(.90)(.93) = 0.770
    Supplier 3: (.95)(.90)(.88) = 0.752
    Supplier 1 should be selected.
27.
Answer
27. The reliability of the parallel components, Rcc, shown in the diagram above
    the problem is calculated as:
    Rcc = 1 - (1 - 0.85)^2 = 0.98
    The system reliability is then:
    RaRbRccRd = (0.98)(0.95)(0.98)(0.99) = 0.903
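Series and parallel reliabilities such as those in Problems 26, 27, and 29 reduce to two one-line rules: multiply reliabilities in series, and in parallel subtract from one the probability that every redundant unit fails. A sketch:

```python
from math import prod

def series(reliabilities):
    """All components must work."""
    return prod(reliabilities)

def parallel(reliabilities):
    """The system works unless every redundant unit fails."""
    return 1 - prod(1 - r for r in reliabilities)

# Problem 27: two 0.85 components in parallel, in series with three others
r_cc = parallel([0.85, 0.85])                 # 0.9775, shown rounded as 0.98
r_system = series([0.98, 0.95, 0.98, 0.99])   # 0.903
```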
28.
29.
Manuplex, Inc. has a complex manufacturing process, with three operations that are
performed in series. Because of the nature of the process, machines frequently fall out of
adjustment and must be repaired. To keep the system going, two identical machines are
used at each stage; thus, if one fails, the other can be used while the first is repaired (see
accompanying figure).
Reliability: .70 (stage 1), .80 (stage 2), .95 (stage 3)
a. Analyze the system reliability, assuming only one machine at each stage (all the backup
machines are out of operation).
b. How much is the reliability improved by having two machines at each stage?
Answer
29. a) With only one machine operating at each stage, the system reliability is
       Rs = (0.70)(0.80)(0.95) = 0.532
    b) With two parallel machines at each stage:
       Rs = [1 - (1 - 0.70)^2][1 - (1 - 0.80)^2][1 - (1 - 0.95)^2]
          = (0.91)(0.96)(0.9975) = 0.871
       Reliability improves by 0.871 - 0.532 = 0.339.

30.
31.
A gauge repeatability and reproducibility study at Frankford Brake Systems collected the
data found in the C12Data file for Prob.12-31 on the Premium website for this chapter.
Analyze these data. The part specification is 1.0 ± 0.06 mm.
Answer
31. (See the spreadsheet for the detailed calculations.) The overall average is
    x-bar-D = 0.1014.
    Note: any ranges beyond the control limit for individual ranges may be the
    result of assignable causes. Identify and correct; discard those values and
    recompute the statistics.
A gauge repeatability and reproducibility study was made at Precision Parts, Inc., using
three operators, taking three trials each on identical parts. The data that can be found in
the C12Data file for Prob.12-32 on the Premium website for this chapter were collected.
Do you see any problems after analyzing these data? What should be done? The part
specification for the collar that was measured was 1.6 ± 0.2 inches.
Answer
32. Calculate R-bar and x-bar for each operator, including the third operator
    (x-bar-3), and the overall average x-bar-D. With D4 = 2.574 and
    R-bar = 0.117, the control limit for individual ranges is
    UCLR = D4 R-bar = 0.302.
    Note: any ranges beyond this limit may be the result of assignable causes.
    Identify and correct; discard those values and recompute the statistics.

    Repeatability (EV) = 0.358
    Reproducibility (AV) = 0.142
    Repeatability and reproducibility:
    R&R = sqrt((EV)^2 + (AV)^2) = sqrt((0.358)^2 + (0.142)^2) = 0.385

    Tolerance analysis (tolerance = 2 x 0.2 = 0.4):
    EV  = 89.47% of tolerance
    AV  = 35.58% of tolerance
    R&R = 96.28% of tolerance

    Because R&R consumes over 90 percent of the tolerance, there is clearly a
    problem: the measurement system is not acceptable and must be improved.
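The tolerance analysis in a gauge study divides each variation component by the total part tolerance. A sketch using the Problem 32 figures (the 0.4 tolerance is the full width of 1.6 ± 0.2; the function name is illustrative):

```python
import math

def rr_study(ev, av, tolerance):
    """Express equipment (EV), appraiser (AV), and combined (R&R)
    variation as percentages of the part tolerance."""
    rr = math.sqrt(ev ** 2 + av ** 2)
    return {"EV%": 100 * ev / tolerance,
            "AV%": 100 * av / tolerance,
            "R&R%": 100 * rr / tolerance}

result = rr_study(0.358, 0.142, 0.4)
# R&R consumes roughly 96% of the tolerance: an unacceptable gauge
```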
For sample statistics at Mach3 Tool Co. of x-bar = 0.5750, s = 0.0065, and
a tolerance of 0.575 ± 0.007:

Cp = (UTL - LTL)/6s = (0.582 - 0.568)/(6 x 0.0065) = 0.359; the process is
not capable (unsatisfactory).

Spreadsheet results:
Nominal specification   0.5750
Upper tolerance limit   0.5820
Lower tolerance limit   0.5680
Average                 0.5750
Std. deviation          0.0065
Cp    0.3567
Cpl   0.3547
Cpu   0.3587
Cpk   0.3547
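The capability indexes used in Problems 33 through 39 can be bundled into one short helper (the function name is illustrative):

```python
def capability(mean, sd, lower, upper):
    """Return Cp, Cpl, Cpu, and Cpk for a normally distributed process."""
    cp = (upper - lower) / (6 * sd)
    cpl = (mean - lower) / (3 * sd)
    cpu = (upper - mean) / (3 * sd)
    return cp, cpl, cpu, min(cpl, cpu)

# Problem 33: x-bar = 0.5750, s = 0.0065, tolerance 0.575 +/- 0.007
cp, cpl, cpu, cpk = capability(0.5750, 0.0065, 0.568, 0.582)
```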
34. Adjustments were made in the process at Mach3 Tool Co., discussed in Problem 33 and 25
more samples were taken. The results are given in the C12Data file for Prob.12-34 on the
Premium website for this chapter. What can you observe about the process? Is it now capable
of producing within acceptable limits?
Answer
34. For sample statistics of x-bar = 0.5755, s = 0.00169, and a tolerance of
    0.575 ± 0.007:

    Cp = (UTL - LTL)/6s = (0.582 - 0.568)/(6 x 0.00169) = 1.38, which is
    inside acceptable limits.

    The standard deviation is smaller than previously, indicating less spread
    within the data. See spreadsheet P12-35.xls for more descriptive analysis.
    Note, however, that the other process capability indexes, below, show that
    there are still some slight problems with process centering that must be
    addressed.

    Nominal specification   0.5750
    Upper tolerance limit   0.5820
    Lower tolerance limit   0.5680
    Average                 0.5755
    Std. deviation          0.00169
    Cp    1.3838
    Cpl   1.4866
    Cpu   1.2810
    Cpk   1.2810

35.
From the data for Kermit Theatrical Products, construct a histogram and estimate the
process capability. If the specifications are 24 ± 0.03, estimate the percentage of parts that
will be nonconforming. Finally, compute Cp, Cpu, and Cpl. Samples for three parts were
taken as shown in the C12Data file for Prob12-35 on the student Premium website for this
chapter.
Answer
35. Summary statistics and the histogram from spreadsheet Prob12-35.xls show:

    Mean                      24.0014
    Standard Error             0.00097
    Median                    24.001
    Mode                      24.000
    Standard Deviation         0.00967
    Sample Variance            9.4E-05
    Kurtosis                   0.53132
    Skewness                   0.05271
    Range                      0.058
    Minimum                   23.971
    Maximum                   24.029
    Confidence Level (95.0%)   0.00192

    Bin       Frequency
    23.971    1
    23.977    0
    23.983    0
    23.988    7
    23.994    14
    24.000    26
    24.006    20
    24.012    19
    24.017    7
    24.023    5
    More      1

    [Histogram of the frequencies above, with the bins on the horizontal axis.]

    With specifications of 24 ± 0.03:
    Cp  = (24.03 - 23.97)/(6 x 0.00967) = 1.03
    Cpu = (24.03 - 24.0014)/(3 x 0.00967) = 0.99
    Cpl = (24.0014 - 23.97)/(3 x 0.00967) = 1.08
    The estimated fraction nonconforming is about 0.2 percent.
Samples for three parts made at River City Parts Co. were taken as shown in the C12Data
file for Prob.12-36 on the Premium website for this chapter. Data set 1 is for part 1, data
set 2 is for part 2, and data set 3 is for part 3.
a. Calculate the mean and standard deviations for each part and compare them to the
following specification limits:
Part   Nominal   Tolerance
1      1.750     ± 0.045
2      2.000     ± 0.060
3      1.250     ± 0.030
b. Will the production process permit an acceptable fit of all parts into a slot with a
specification of 5 ± 0.081 at least 99.73 percent of the time?
Answer
36. (See the spreadsheet for the means and standard deviations of each part.)
    For the assembly of the three parts, the combined standard deviation is
    0.0188, so 3 sigma = 0.0564, which is within the slot tolerance of ± 0.081.
    The process will therefore permit an acceptable fit at least 99.73 percent
    of the time.
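Part (b) is a statistical tolerancing question: for independent part dimensions, the standard deviation of the stack is the root sum of squares of the part standard deviations, and the ±3 sigma spread is compared with the slot tolerance. A sketch (the part standard deviations shown are hypothetical; the real ones come from the C12Data file):

```python
import math

def assembly_sigma(part_sigmas):
    """Std. dev. of a stack of independent part dimensions."""
    return math.sqrt(sum(s ** 2 for s in part_sigmas))

# Hypothetical standard deviations for parts 1, 2, and 3
sigma_total = assembly_sigma([0.012, 0.011, 0.009])
fits = 3 * sigma_total <= 0.081   # 99.73% of assemblies inside 5 +/- 0.081?
```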
Omega Technology Ltd. (OTL) is a small manufacturing company that produces various
parts for tool manufacturers. One of OTL's production processes involves producing a
Teflon spacer plate that has a tolerance of 0.05 to 0.100 cm in thickness. On the
recommendation of the quality assurance (QA) department and over objections of the
plant manager, OTL just purchased some new equipment to make these parts. Recently,
the production manager was receiving complaints from customers about high levels of
nonconforming parts. He suspected the new equipment, but neither QA nor plant
management would listen.
The manager discussed the issue with one of his production supervisors who mentioned
that she had just collected some process data for a study that the quality assurance
department was undertaking. The manager decided that he would prove his point by
showing that the new equipment was not capable of meeting the specifications. The data
provided by the supervisor are shown in the C12Data file for Problem 12-37 on the
Premium website for this chapter. Perform a process capability study on these data and
interpret your results.
Answer
37.
Omega Technology Ltd.'s process capability results from the Excel spreadsheet software
are shown below. (See spreadsheet Prob12-37.xls for details.)
Average              0.0764
Standard deviation   0.0104
Cp    0.8019
Cpl   0.8468
Cpu   0.7569
Cpk   0.7569

These data show that the process has a rather low overall capability,
with Cp = 0.8019 and a total of 1.71% of the values falling outside of
the specification limits of 0.05 to 0.10.
Process statistics: x-bar = 0.0764, s = 0.0104

z = (0.10 - 0.0764)/0.0104 = 2.27 gives the probability that the part will
exceed the upper specification limit.

z = (0.05 - 0.0764)/0.0104 = -2.54 gives the probability that the part will
fall below the lower specification limit.
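The 1.71% figure is the sum of the two normal tail areas beyond the z-values shown. With Python 3.8+ this is a single calculation:

```python
from statistics import NormalDist

def fraction_nonconforming(mean, sd, lower, upper):
    """Total tail area outside the specification limits."""
    d = NormalDist(mean, sd)
    return d.cdf(lower) + (1 - d.cdf(upper))

frac = fraction_nonconforming(0.0764, 0.0104, 0.05, 0.10)   # about 1.7%
```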
Answer
38. (a) With the process centered on the target (x-bar = 25, sigma = 1.2):
    Cp = (UTL - LTL)/6 sigma = 0.903
    Cmodified = Cp / sqrt(1 + (mean - target)^2/sigma^2) = 0.903/sqrt(1) = 0.903
    Cpu = (UTL - x-bar)/3 sigma and Cpl = (x-bar - LTL)/3 sigma, so
    Cpk = min(Cpl, Cpu) = 0.903
    Conclusion: The process is centered on the mean, but it does not have
    adequate capability at this time.

    (b) With x-bar = 23 and sigma = 1.2 (sigma unchanged):
    Cp = (UTL - LTL)/6 sigma = 0.903
    Cmodified = Cp / sqrt(1 + (mean - target)^2/sigma^2) = 0.584
    Because of the shift away from the target, capability is lower.
    Cpu = (28.25 - 23.0)/(3 x 1.2) = 1.458
    Cpl = (23.0 - 21.75)/(3 x 1.2) = 0.347
    Conclusion: The process is off target and still does not have adequate
    capability at this time.

    (c) With sigma-new = 0.759:
    Cp = (UTL - LTL)/6 sigma = 1.427
    Cmodified = Cp / sqrt(1 + (mean - target)^2/sigma^2) = 1.427
    If there is no shift away from the target, capability is equal to Cp.
    Cpu = (28.25 - 25.0)/(3 x 0.759) = 1.427
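The "modified" index above discounts Cp for off-target centering; one common parameterization of this Taguchi-style index can be sketched as follows (the spreadsheet the manual references may use a slightly different form):

```python
import math

def c_modified(mean, sd, lower, upper, target):
    """Cp discounted for the distance of the mean from the target."""
    cp = (upper - lower) / (6 * sd)
    return cp / math.sqrt(1 + ((mean - target) / sd) ** 2)

on_target = c_modified(25.0, 1.2, 21.75, 28.25, 25.0)   # equals Cp, about 0.903
shifted = c_modified(23.0, 1.2, 21.75, 28.25, 25.0)     # smaller: shift penalized
```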
A process has upper and lower tolerance limits of 5.80 and 5.00, respectively. If the
customer requires a demonstrated Cp of 2.0, what must the standard deviation be? If both
Cpu and Cpl must also be 2.0, determine the process mean, using that calculated standard
deviation, assuming a normal distribution of output.
Answer
39. Cp = (UTL - LTL)/6 sigma = (5.80 - 5.00)/6 sigma = 2.0, so
    6 sigma = 0.8/2.0 = 0.4 and sigma = 0.0667.
    Setting Cpu = (5.80 - x-bar)/(3 x 0.0667) = 2.0 gives x-bar = 5.80 - 0.4 = 5.40;
    Cpl = (x-bar - 5.00)/(3 x 0.0667) = 2.0 likewise gives x-bar = 5.00 + 0.4 = 5.40.
    The process mean must therefore be centered at 5.40.

40.
Clearly demonstrate that Six Sigma requires Cp = 2.0 and Cpk = 1.5.
Answer
40.
In many cases, controlling the process to the target is less expensive than reducing the
process variability. This table can help assess these trade-offs.
The sigma level can easily be calculated on an Excel spreadsheet using the formula:
=NORMSINV(1-Number of Defects/Number of Opportunities) + SHIFT
52
or equivalently,
=NORMSINV(1-dpmo/1,000,000) + SHIFT
SHIFT refers to the off-centering as used in Table 11.1. Using the airline example
discussed earlier, if we had 3 lost bags for 8000(1.6) = 12,800 opportunities, we would
find =NORMSINV(1-3/12800) + 1.5 = 4.99828 or about a 5-sigma level.
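The same computation is available in Python's standard library (3.8+), which is handy for checking spreadsheet results:

```python
from statistics import NormalDist

def sigma_level(defects, opportunities, shift=1.5):
    """Equivalent of =NORMSINV(1 - defects/opportunities) + SHIFT."""
    return NormalDist().inv_cdf(1 - defects / opportunities) + shift

level = sigma_level(3, 12800)   # about 5.0 for the lost-bag example
```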
Using data from Problem 39, above, we can show that:
Cp = (UTL - LTL)/6 sigma = (5.80 - 5.00)/(6 x 0.0667) = 2.0
If the mean shifts 1.5 sigma from the target, the nearest specification limit
is 4.5 sigma from the mean, so
Cpk = min(Cpu, Cpl) = 4.5 sigma / 3 sigma = 1.5
[Matrix relating customer attributes (speed of check-in and checkout, lowest
available price, understanding and willingness of personnel to solve problems,
amenities, costs, accuracy) to technical requirements (times, dates, Internet,
phone; process standards; fees; error prevention).]
4.
The matrix below relates customer attributes and technical requirements.

Technical Requirements:
"Design for assembly"
Simple "launch" mechanism
Wing, tail, body design
Quality of wood
Price/durability ratio
The best way to prioritize the voice of the customer would be to have a focus group of
typical customers, such as craftspeople, do-it-yourselfers, and hobbyists, provide input on
how they used the screwdriver and their priorities. Below is a possible configuration of the
matrix, with priorities for a serious craftsperson. Such a person would look for quality and
functionality over price or extra features, such as ratchets or interchangeable bits.
[Matrix for the screwdriver: customer attributes (easy to use, durable,
comfortable, versatile, inexpensive) with priorities, against technical
requirements (price, interchangeable bits, steel shaft, ratchet capability,
plastic handle, rubber grip). The legend distinguishes strong and weak
relationships.]

5.
6.
5.
6.
7.
Answers will vary, depending on the individual websites and topics
chosen by the students.
8.
The answer to the question of whether students agree or disagree with the relative
importance rankings obtained from the study of the RRC at Tennessee Tech ultimately
depends on students' opinions. However, a strong case might be made that the relative
importance score would depend on the situation. For small, rush, duplicating jobs, prompt
service would seem to be of greatest importance. For research jobs where specific
information has to be found, knowledge and courtesy of the employees would be highly
desirable, as well as accuracy, which might be a close second in importance. For the
inexperienced user, such as a freshman student, empathy and willingness to help would
possibly be ranked as the two highest criteria.
2.
Concentrating on the top four characteristics, the following weighted scores can be
calculated:
Resources (personnel)    135
Information handling      87
Customer handling         68
Attitudes and morale      68
The three areas on which the analysts focused were document handling, training, and
layout. The above weighted scores would seem to lend little support to the need to deploy
a new document handling process (45 point score), nor to improve the layout (6 point
score), which have very little impact on customer quality criteria. Training may be
required, but the focus on document handling would seem to be unnecessary.
3.
Given the high ranking of resources (personnel), it appears that more attention should be
paid to selection and retention issues. Information handling, in second place, also has a
major impact, with customer handling, and attitudes and morale tied for third place. These
categories could be improved by training and by process analysis to determine if the best
processes were being used. As a result, it could be predicted that morale and customer
satisfaction would likely increase.
The next steps would include gathering data using the checklist form that the committee
designed. The committee might also want to develop process flow charts, while waiting
for the fall data to be gathered and analyzed.
2.
The data from the checklist should be put into a format, perhaps in a spreadsheet, where it
could be analyzed. Analysis tools might include Pareto analysis and histograms.
Segmentation should also be used to find out the incidence of falls in likely locations.
3.
Improved processes and systems should be developed based on analysis of the checklist
and process charting. The significant few causes, based on the Pareto analysis, should be
addressed first, in order to achieve the greatest immediate impact. Analysis of the process
flow chart could reveal places where processes could be simplified, and might also identify
conditions that would contribute to patient falls, so they could be eliminated.
Bonus Materials
Case - Hydraulic Lift Company
1.
The key to the calculation of an estimated process capability for this case is to calculate an
estimated standard deviation for each condition. Using the simplifying assumption that the
sample standard deviation is a good approximation of the population standard deviation will
allow us to make a reasonable estimate, even though for the cases of the small sample sizes
of 30 or 35 that assumption would be open to argument by statisticians.
We will concentrate on the calculation of Cp for only cases (a) and (e), since it is obvious that
the capability became drastically worse during the experimental stages from (b) to (d).
Reading the data from the histograms, we can use the calculation of the sample standard
deviation with grouped data from the chapter. The frequency histogram for condition (a)
shows:

Group   Midpoint x   Frequency   fx      fx^2
1        45           3           135      6,075
2        50           6           300     15,000
3        55           0             0          0
4        60          16           960     57,600
5        65           4           260     16,900
6        70          22         1,540    107,800
7        75           6           450     33,750
8        80          23         1,840    147,200
9        85           5           425     36,125
10       90          10           900     81,000
11       95           0             0          0
12      100           5           500     50,000
Totals                          7,310    551,450

x-bar = (sum of fx)/n = 7,310/100 = 73.1

s = sqrt[(sum of fx^2)/(n - 1) - ((sum of fx)^2/n)/(n - 1)]
  = sqrt[551,450/99 - ((7,310)^2/100)/99]
  = 13.138

For condition (e), the sum of fx = 2,620, the sum of fx^2 = 197,450, and n = 35:

x-bar = 2,620/35 = 74.857

s = sqrt[(sum of fx^2)/(n - 1) - ((sum of fx)^2/n)/(n - 1)]
  = sqrt[197,450/34 - ((2,620)^2/35)/34]
  = 6.241
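The grouped-data formulas can be checked directly; the lists below are the condition (a) histogram:

```python
import math

def grouped_stats(midpoints, freqs):
    """Mean and sample standard deviation from a frequency histogram."""
    n = sum(freqs)
    sfx = sum(f * x for x, f in zip(midpoints, freqs))
    sfx2 = sum(f * x * x for x, f in zip(midpoints, freqs))
    mean = sfx / n
    s = math.sqrt(sfx2 / (n - 1) - (sfx ** 2 / n) / (n - 1))
    return mean, s

mids = [45, 50, 55, 60, 65, 70, 75, 80, 85, 90, 95, 100]
freqs = [3, 6, 0, 16, 4, 22, 6, 23, 5, 10, 0, 5]
mean_a, s_a = grouped_stats(mids, freqs)   # 73.1 and about 13.14
```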
The process used here was obviously a systematic process of problem solving similar to the
one suggested in this chapter. The first step was a) to understand the "mess." A Pareto-like
approach found that 50% of the defective items were due to dimensional problems on one
diameter of the valve stem. Next, b) find facts on the process capability; c) specific problems
were identified: over adjustment by the operator, and lack of ability of the machine to hold
tolerances; d) ideas on machine adjustments and improvements were generated; e) solutions
were implemented, with the machine being adjusted and later overhauled; f) as Deming said,
"Do it over, again and again." [Step 7, added to the process, is continuous improvement].
See spreadsheet 12blomrrcase.xls for details. Note that there are some rounding errors
below that make answers on the spreadsheet appear slightly different.
Calculations for the repeatability and reproducibility (R&R) study are as follows:

x-bar-1 = (sum of Mijk)/nr = 2.3378/30 = 0.0779
R-bar-1 = (sum of Rij)/n = 0.1030/15 = 0.0069

EV = 0.0301 (60.2%)
OV = 0 (0%)
RR = sqrt((EV)^2 + (OV)^2) = sqrt((0.0301)^2 + (0)^2) = 0.0301 (60.2%)

TV = sqrt((RR)^2 + (PV)^2)
PV = Rp x K3
Rp = range of part averages for the entire sample: 0.1013 - 0.0516 = 0.0497
K3 = 1.45 from Table 11.5
PV = (0.0497)(1.45) = 0.0721

Thus TV = sqrt((RR)^2 + (PV)^2) = sqrt((0.0301)^2 + (0.0721)^2) = 0.0781

%OV = 0
%EV = %RR, related to TV = 100(0.0301/0.0781) = 38.5
%PV, related to TV = 100(0.0721/0.0781) = 92.3
NOTE: The sum of the above percentages will not add to 100.
Based on the "rules" for process capability given in the text, it can be assumed that the
equipment and the process need to be improved, since none of the percentages fall below
the 30% or 10% minimums. The operators are consistent in their measurements, so their
methods are not in question at this point.
Worn or faulty gauges should be discarded and the rest should be calibrated.
Case - The PIVOT Initiative Part II
2.
3.