
LEAN SIX SIGMA

What Does Lean Six Sigma Mean for Services?


It is a business improvement methodology that maximizes shareholder value by
achieving the fastest rate of improvement in customer satisfaction, cost, quality,
process speed and invested capital.
The fusion of Lean and Six Sigma enables the reduction of the cost of complexity.
Lean manufacturing or lean production, which is often known simply as "Lean", is a
production practice that considers the expenditure of resources for any goal other
than the creation of value for the end customer to be wasteful, and thus a target for
elimination.
Six Sigma:
1. Emphasizes the need to recognize opportunities and eliminate defects as
defined by the customer
2. Recognizes that variation hinders our ability to reliably deliver high quality
services
3. Requires data-driven decisions and incorporates a comprehensive set of
quality tools under a powerful framework for effective problem solving
4. Provides a highly prescriptive cultural infrastructure effective in obtaining
sustainable results
Lean:
1. Focuses on maximizing process velocity
2. Provides tools for analyzing process flow and delay times at each activity in a
process
3. Centers on the separation of value-added from non value-added work with
tools to eliminate the root causes of non-value-added activities and their cost.
4. Provides a means for quantifying and eliminating the cost of complexity
The two methodologies interact and reinforce one another, such that percentage
gains in return on invested capital (ROIC%) are much faster if both are
implemented together.

HOW ARE SPEED AND QUALITY LINKED?


Approximately 30-50% of the cost in a service organization is driven by slow
process speed or by performing rework to satisfy customer needs.
Only a fast and responsive process can achieve the highest levels of quality, and
only a high-quality process can sustain high velocity.
Only Lean + Six Sigma = Low cost

5 Lean Tools and Principles to Integrate into Six Sigma

Increasingly, organizations that use Six Sigma are making an effort to integrate
Lean into their existing process-improvement framework. For many,
combining Six Sigma’s focus on process quality and Lean's emphasis on turn-around
time results in more high-impact, quick-hit projects. To gain this advantage,
however, organizations must face a difficult obstacle: integrating Lean without
creating ripples in the existing Six Sigma structure. If the Lean introduction is not
done properly, it can lead to more pitfalls than successes.
With a structured approach, though, it is possible to merge Lean into a mature Six
Sigma framework, as was experienced by a business unit of a Fortune 10 company.
During a Work-out, the unit evaluated the various principles of Lean to determine
which could be subtly introduced and used effectively to augment the existing Six
Sigma framework. They found that five Lean tools and principles were particularly
applicable:
1. Value Stream Mapping
In the Analyze phase of a DMAIC project, a value stream map can be created that
shows the flow of materials and information, and categorizes activities into three
segments: value enabling, value adding and non value adding. The focus of this tool
is on identifying and eliminating the non-value added activities in each process step
and reducing the wait time between consecutive steps wherever possible. Value
enabling activities, however, cannot be totally eliminated from a system. Instead,
they can be sub-classified into value adding and non-value adding activities,
allowing those value enabling activities that are non-valued added to be eliminated.
These eliminations help make a process more compact – a benefit in process
improvement projects aimed at reducing variation. This tool also can be a part of
a Kaizen cycle, incorporated within the Analyze and Improve phases.
An example of how the company used value stream mapping: In a digitized process
under study, the value stream map demonstrated that the workflow went to the
same approver step twice – without any value addition from the previous step to
benefit the approver at the later step. Also, the subsequent steps were not
dependent on the second approval. Hence, the second approval did not add any
value to the process – and it was eliminated from the workflow.
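The categorization step of value stream mapping can be sketched programmatically. In this minimal Python sketch, the step names, durations, and categories are invented for illustration; they are not from the case study above.

```python
# Hypothetical process steps for a value stream map. Each step is tagged
# as value-adding, value-enabling, or non-value-adding.
steps = [
    {"name": "submit request",  "minutes": 5,   "category": "value-adding"},
    {"name": "first approval",  "minutes": 10,  "category": "value-enabling"},
    {"name": "wait in queue",   "minutes": 120, "category": "non-value-adding"},
    {"name": "second approval", "minutes": 10,  "category": "non-value-adding"},
    {"name": "fulfil request",  "minutes": 30,  "category": "value-adding"},
]

def summarize(steps):
    """Total minutes per category; the non-value-adding total marks the
    elimination candidates, such as a redundant second approval."""
    totals = {}
    for step in steps:
        totals[step["category"]] = totals.get(step["category"], 0) + step["minutes"]
    return totals

totals = summarize(steps)
total_lead_time = sum(s["minutes"] for s in steps)
```

In this invented example, the non-value-adding total (130 of 175 minutes) shows immediately where elimination effort should go.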
2. Takt Time
Takt is a German word that can be roughly translated as "beat." Takt time is the rate
at which completed units need to be finished in order to meet customer
demand. For processes involving cycle times, such as manufacturing or incident
management, the as-is cycle time can be captured in the Measure phase. Then,
during the Analyze phase, the cycle time can be compared with existing service
level agreements (SLAs). If a mismatch exceeds the tolerance, improvements would
be needed to match the cycle time with the takt time for the system.
For instance, an incident-management tool was studied that had a significant
number of cases missing their SLAs. The study revealed that the tool, which had
two basic stages for providing the resolution, always missed the SLA in the second
stage. The resolution time for the case was measured as the end-to-end resolution –
resulting in most of the SLA period elapsing in the first stage and little time
remaining for the second stage. To resolve this, the SLA for the case was split into
different components for the two stages. This helped distribute the total SLA time
among the two stages so the slippage could be monitored individually.
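The takt-time comparison described above reduces to a simple calculation. A minimal sketch, with hypothetical demand and cycle-time figures:

```python
def takt_time(available_minutes, units_demanded):
    """Pace at which one unit must be completed to meet customer demand."""
    return available_minutes / units_demanded

# Hypothetical figures: 420 working minutes per day against a demand of
# 60 resolved incidents per day.
takt = takt_time(420, 60)        # one incident every 7.0 minutes
measured_cycle_time = 9.5        # as-is cycle time from the Measure phase
needs_improvement = measured_cycle_time > takt
```

If the measured cycle time exceeds takt beyond the agreed tolerance, the mismatch flags the process for improvement in the Analyze phase.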
3. Ishikawa (Cause-and-Effect) Diagram and 5 Whys
In the Analyze phase, the absence of concrete statistical data sometimes can
make the identification of a root cause difficult. In those scenarios, the 5 Whys –
asking "Why?" five times – along with a cause-and-effect diagram, can make the
task more manageable. The 5 Whys tool also can help uncover the process
dynamics and the areas that can be addressed easily.
4. Heijunka (Load Balancing)
A Japanese term, Heijunka refers to a system of production designed to provide a
more even and consistent flow of work. This principle can be incorporated in the
Design phase if the root cause analysis during Analyze points to bottlenecks in the
process. Load balancing can be used to introduce a pull in the system rather than
letting it operate on push – thus alleviating the bottlenecks. Efforts for introducing a
level load balance in the system also automatically reduce inventory. If takt time
principles are used while designing the system, it would help ensure a level load
balance.
5. Poka Yoke (Mistake Proofing)
A Japanese phrase meaning mistake proofing, poka yoke can be used to tune
process steps and also when designing a new system
altogether with DMADV (Define, Measure, Analyze, Design, Verify). A combination of
an Ishikawa chart and Pareto analysis can be useful in Analyze in listing the major
issues plaguing the as-is process. During the Improve and Design phases, the
possibilities for eliminating a major cause of errors can be explored by improving or
redesigning the system to avoid error-inducing scenarios.
An example of poka yoke in action: A large number of workflows in a payroll process
were being terminated abruptly. Users were provided with a standard set of action
buttons for each step: "Approve to Next" and "Approve to Close." The former
approved the step and sent the workflow forward, while the latter approved and
closed the workflow. The cause for the high number of terminations was the
confusing nomenclature on the buttons. The issue was resolved by providing
mouse-over texts on both the buttons clearly labeling the scenarios when each
should be used.
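The Pareto analysis mentioned above, used alongside an Ishikawa chart, amounts to ranking causes by frequency and tracking the cumulative share. A sketch with an invented defect log (causes and counts are hypothetical, not from the payroll case):

```python
from collections import Counter

# Hypothetical log: each entry is the recorded cause of one abruptly
# terminated workflow.
log = (["confusing button label"] * 45 + ["missing data"] * 25
       + ["timeout"] * 20 + ["duplicate entry"] * 7 + ["other"] * 3)

def pareto(log):
    """Causes sorted by frequency, with the cumulative percentage that
    shows which few causes account for most of the defects."""
    counts = Counter(log).most_common()
    total = sum(n for _, n in counts)
    rows, cumulative = [], 0
    for cause, n in counts:
        cumulative += n
        rows.append((cause, n, round(100 * cumulative / total, 1)))
    return rows
```

In this invented log, the top two causes already account for 70% of terminations, so the Improve phase would target those first.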
The Next Steps
The Work-out team went on to formulate a roadmap to introduce Lean Six Sigma
(LSS) by starting with a push and gradually transitioning into an induced pull. The
key milestones in the roadmap:
• Identify initiatives to be executed using LSS by dedicated Black Belts and
showcased organization-wide on completion.
• Hold LSS awareness sessions for all associates. Create and make available
LSS training materials for associates.
• Assign Green Belts projects that require them to use applicable Lean tools as
part of the Six Sigma rigor. Recognize the best-executed LSS projects.
Based on the action items from the Work-out, the team also modified
the storyboard, which was previously modeled purely on the Six Sigma approach for
process improvement, to include Lean tools and principles to facilitate execution of
LSS projects. The new system also was subject to continuous analysis and
evaluations with a view for further improvements. When the possibility for
improvements in key areas arose, the team took them up as Kaizen events.
Through the initial push from the leadership team, combined with learning aids on
Lean tools, the LSS approach was widely accepted throughout the organization. This
increased the tangible benefits and improved the turn-around time of process-
improvement projects at the company.

What is Lean?
Lean is a methodology that is used to accelerate the velocity and reduce the cost of
any process (be it service or manufacturing) by removing waste. Lean is founded on
a mathematical result known as Little's Law:
Lead Time of Any Process = Quantity of Things in Process / Average Completion Rate per Unit of Time

The lead-time is the amount of time taken between the entry of work into a process
(which may consist of many activities) to the time the work exits the process. In
procurement the Things in Process are the number of requisitions, in product
development the number of Projects In Process, and in manufacturing the amount
of Work In Process. Lean contains a well-defined set of tools that are used to control
and then reduce the number of Things in Process, thus eliminating the non-value
add cost driven by those Things in Process. The Pull/Kanban system puts a cap on
the number of things in process, thus putting a cap on the lead-time. Lean also
contains tools to reduce the quantity of things in process including setup reduction,
total productive maintenance, 5S, etc. For example, setup reduction allows the
reduction of the time spent on producing a quantity of any given offering or
product, reducing lead-time without reducing the completion rate. The Lean
methodology has a bias for action, leveraging Kaizen to rapidly improve processes
and drive results.
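Little's Law can be applied directly. A minimal sketch, using hypothetical procurement figures:

```python
def lead_time(things_in_process, completion_rate):
    """Little's Law: lead time = things in process / average completion
    rate per unit of time."""
    return things_in_process / completion_rate

# Hypothetical figures: 120 requisitions in process, 40 completed per day.
print(lead_time(120, 40))   # 3.0 (days)

# A Pull/Kanban cap on things in process caps the lead time directly:
print(lead_time(40, 40))    # 1.0 (day)
```

This is why capping the number of things in process, as a Pull/Kanban system does, puts a cap on lead time without requiring any change to the completion rate.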

Q: Why should Lean be important to Six Sigma professionals?

A: Whereas Six Sigma is most closely associated with defects and quality, Lean is
linked to speed, efficiency, and waste. Lean provides tools to reduce lead-time of
any process and eliminate non-value add cost. Six Sigma does not contain any tools
to control lead time (e.g., Pull systems), or tools specific to the reduction of lead
time (e.g., setup reduction). Since companies must become more responsive to
changing customer needs, faster lead times are essential in all endeavors. Lean is
an important complement to Six Sigma and fits well within the Six
Sigma DMAIC process. Additionally, the Lean Kaizen approach is a great method
that can be used to accelerate the rate of improvements.

You need to improve quality so you can achieve maximum speed, and you
need to do the things that allow maximum speed in order to reach the
highest sigma levels of quality. In other words, you need both Lean (speed) and
Six Sigma (quality) principles and tools to drive improvements in ROIC and achieve
the best competitive position.
Q: Can you provide an example of how Lean coupled with Six Sigma would
help address a transactional process issue? A manufacturing process
issue?

A: The processes of all companies and organizations must:


1. Become faster and more responsive to customers
2. Achieve Six Sigma capability
3. Operate at world class cost
Only the combination of Six Sigma and Lean can fulfill all three goals. In any
process, Lean Six Sigma creates a value stream map of the process identifying
value add and non-value add costs, and captures the Voice of the customer to
define the customer Critical To Quality issues. Projects within the process are then
prioritized based on the delay time they inject. This prioritization process inevitably
pinpoints activities with high defect rates (Six Sigma tools) or long setups and
downtime (Lean tools). In manufacturing, a further benefit results from a reduction
in working capital and capital expenditure. We have found over the last 15 years
that these methods apply in virtually every kind of process from healthcare to
financial services to energy to manufacturing.

Q: What role can Lean play in a company that has already started
implementing Six Sigma?

A: Lean will add another dimension of improvement in process speed and reduction
of non-value add cost. Further, by accelerating process speed, Lean provides faster
feedback and more cycles of learning, enhancing the power of Six Sigma tools. For
example, an L18 Design of Experiment might require about 100 separate runs to
optimize parameters and minimize variation. Reducing the lead time by 80% will
allow the fractional factorial design to be completed five times faster. In addition,
the Lean Kaizen approach allows Black Belts to implement rapid improvements
whenever possible.
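The five-times-faster claim follows directly from the arithmetic; the run count and per-run lead time below are hypothetical:

```python
# Hypothetical experiment: 100 runs at 10 hours of lead time per run.
runs = 100
hours_per_run = 10.0
total_before = runs * hours_per_run

# An 80% lead-time reduction leaves 20% of the original per-run time.
hours_per_run_after = hours_per_run * (1 - 0.80)
total_after = runs * hours_per_run_after

speedup = round(total_before / total_after, 2)
print(speedup)   # 5.0
```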

Six Sigma Principles

Six Sigma Principles allow us to reduce variation in performance up front in the
design. By applying Six Sigma Principles in the initial design stages of any product,
process or service and using a refined set of evaluation tools, DFSS incorporates the
highest quality at the point of introduction. Six Sigma Principles work not just in
manufacturing and electronics, where Six Sigma originated, but in any business,
from banking and financial services to chemicals, pharmaceuticals, utilities, health
care-even the entertainment industry.

Six Sigma Principles can be successfully applied to all situations:


• Design
• Manufacturing
• Finance
• Administration etc.

The Six Sigma Principles are being used in nearly every industry to reduce waste
and improve performance. Six Sigma Principles include customer focus and
management by fact using statistical methods. Motorola applied Six Sigma
Principles in publication, sales, corporate accounting and legal departments. The
fundamental principles of Six Sigma deployment help practitioners solve
problems efficiently and effectively.

Six Sigma Principles will resonate with those who believe legal professionals should
bear some responsibility for their client's bottom-line success, with those who think
that legal services, no less than other services, can improve through process
analysis, with those who agree that process is not something to create anew every
time a new lawsuit or commercial transaction surfaces, and with those driven by a
commitment to continuous improvement and who recognize that the complete
lawyer brings more to the table than legal acumen.

Six Sigma Principles help companies achieve higher levels of quality by virtually
eliminating defects. Six Sigma Principles and their methodology for driving
continuous improvement provide a very useful framework for developing high-quality,
competitive products and services.

The main principles of Six Sigma are:


• Customers must benefit in a way they understand and value
• Managers must direct and lead Six Sigma; they decide on targets and
projects
• The targets must be significant and the payback should be clear up-front
• The whole process should be based on measurable facts
• Six Sigma is based on improving the system, not the people working within the
system

The key Six Sigma Principles are:


• Clearly defining the task
• Reducing variation
• Understanding that product quality will always give competitive advantage
• Measuring and proving improvement

Fundamental Six Sigma Principles


• The fundamental principle of Six Sigma is to "satisfy customer
requirements profitably."
• A fundamental principle of Six Sigma is the reduction of variability; the
tendency is to apply this principle to everything.
• The fundamental principle of Six Sigma methodology is to solve the right
problem the right way. To do this, two important issues need to be
addressed: one is to prioritize the selection of target processes for
improvement; the other is to choose a proper solution strategy.
• The central principle of Six Sigma is that by measuring the defects a
process produces, one can systematically identify and remove sources of
error, so as to approach the ideal state of no defects at all.
• The fundamental principle of Six Sigma management is that if an
organization can measure the defects in a process, its senior management
can systematically determine ways to eliminate them, to approach quality
levels of zero defects.
• The basic principle of Six Sigma is to use a set of methodologies and
techniques that aim to achieve high levels of quality and reduce cost levels.
Origin and meaning of the term "six sigma process"
The following outlines the statistical background of the term Six Sigma: Sigma (the
lower-case Greek letter σ) is used to represent the standard deviation (a measure of
variation) of a statistical population. The term "six sigma process" comes from the
notion that if one has six standard deviations between the mean of a process and
the nearest specification limit, there will be practically no items that fail to meet the
specifications.[5] This is based on the calculation method employed in a process
capability study.
In a capability study, the number of standard deviations between the process mean
and the nearest specification limit is given in sigma units. As process standard
deviation goes up, or the mean of the process moves away from the center of the
tolerance, fewer standard deviations will fit between the mean and the nearest
specification limit, decreasing the sigma number.[5]
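The sigma number from a capability study is just the distance from the process mean to the nearest specification limit, measured in standard deviations. A sketch with hypothetical process figures:

```python
def sigma_level(mean, std_dev, lower_spec, upper_spec):
    """Standard deviations that fit between the process mean and the
    nearest specification limit (the capability-study sigma number)."""
    nearest_margin = min(upper_spec - mean, mean - lower_spec)
    return nearest_margin / std_dev

# Hypothetical process: centered at 10.0 with sigma 0.5, specs 7.0-13.0.
print(sigma_level(10.0, 0.5, 7.0, 13.0))   # 6.0

# Drifting the mean toward a limit lowers the sigma number:
print(sigma_level(11.0, 0.5, 7.0, 13.0))   # 4.0
```

The second call illustrates the point in the text: moving the mean away from the center of the tolerance leaves fewer standard deviations to the nearest limit.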
The role of the 1.5 sigma shift
Experience has shown that in the long term, processes usually do not perform as
well as they do in the short.[5] As a result, the number of sigmas that will fit between
the process mean and the nearest specification limit is likely to drop over time,
compared to an initial short-term study.[5] To account for this real-life increase in
process variation over time, an empirically-based 1.5 sigma shift is introduced into
the calculation.[10][5] According to this idea, a process that fits six sigmas between
the process mean and the nearest specification limit in a short-term study will in the
long term only fit 4.5 sigmas – either because the process mean will move over
time, or because the long-term standard deviation of the process will be greater
than that observed in the short term, or both.[5]
Hence the widely accepted definition of a six sigma process is one that produces 3.4
defective parts per million opportunities (DPMO). This is based on the fact that a
process that is normally distributed will have 3.4 parts per million beyond a point
that is 4.5 standard deviations above or below the mean (one-sided capability
study).[5] So the 3.4 DPMO of a "Six Sigma" process in fact corresponds to 4.5
sigmas, namely 6 sigmas minus the 1.5 sigma shift introduced to account for long-
term variation.[5] This is designed to prevent underestimation of the defect levels
likely to be encountered in real-life operation.[5]

Sigma levels
Taking the 1.5 sigma shift into account, short-term sigma levels correspond to the
following long-term DPMO values (one-sided):
• One Sigma = 690,000 DPMO = 31% efficiency
• Two Sigma = 308,000 DPMO = 69.2% efficiency
• Three Sigma = 66,800 DPMO = 93.32% efficiency
• Four Sigma = 6,210 DPMO = 99.379% efficiency
• Five Sigma = 230 DPMO = 99.977% efficiency
• Six Sigma = 3.4 DPMO = 99.9997% efficiency
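These DPMO figures can be reproduced from the one-sided normal tail probability after applying the 1.5 sigma shift. A minimal sketch using only the standard library:

```python
from math import erfc, sqrt

def long_term_dpmo(short_term_sigma, shift=1.5):
    """Defects per million opportunities for a process whose short-term
    sigma level degrades by `shift` sigmas in the long term (one-sided)."""
    z = short_term_sigma - shift
    tail = 0.5 * erfc(z / sqrt(2))   # P(X > z) for a standard normal
    return tail * 1_000_000

print(round(long_term_dpmo(6), 1))   # 3.4
print(round(long_term_dpmo(3)))      # 66807
```

The second result (66,807, listed above rounded to 66,800) confirms that the tabulated values are the tail probabilities at (sigma − 1.5) standard deviations.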
Methods
Six Sigma has two key methods: DMAIC and DMADV, both inspired
by Deming's Plan-Do-Check-Act Cycle.[9] DMAIC is used to improve an existing
business process; DMADV is used to create new product or process designs.[9]
DMAIC
The basic method consists of the following five steps:
• Define process improvement goals that are consistent with customer demands
and the enterprise strategy.
• Measure key aspects of the current process and collect relevant data.
• Analyze the data to verify cause-and-effect relationships. Determine what the
relationships are, and attempt to ensure that all factors have been considered.
• Improve or optimize the process based upon data analysis using techniques
like Design of Experiments.
• Control to ensure that any deviations from target are corrected before they
result in defects. Set up pilot runs to establish process capability, move on to
production, set up control mechanisms and continuously monitor the process.
DMADV
The basic method consists of the following five steps:
• Define design goals that are consistent with customer demands and the
enterprise strategy.
• Measure and identify CTQs (characteristics that are Critical To Quality), product
capabilities, production process capability, and risks.
• Analyze to develop and design alternatives, create a high-level design and
evaluate design capability to select the best design.
• Design details, optimize the design, and plan for design verification. This phase
may require simulations.
• Verify the design, set up pilot runs, implement the production process and hand
it over to the process owners.
DMADV is also known as DFSS, an abbreviation of "Design For Six Sigma".[9]
Implementation roles
One of the key innovations of Six Sigma is the professionalizing of quality
management functions. Prior to Six Sigma, quality management in practice was
largely relegated to the production floor and to statisticians in a separate quality
department. Six Sigma borrows martial arts ranking terminology to define a
hierarchy (and career path) that cuts across all business functions and a promotion
path straight into the executive suite.
Six Sigma identifies several key roles for its successful implementation.[11]
• Executive Leadership includes the CEO and other members of top management.
They are responsible for setting up a vision for Six Sigma implementation. They
also empower the other role holders with the freedom and resources to explore
new ideas for breakthrough improvements.
• Champions are responsible for Six Sigma implementation across the organization
in an integrated manner. The Executive Leadership draws them from upper
management. Champions also act as mentors to Black Belts.
• Master Black Belts, identified by Champions, act as in-house coaches on Six
Sigma. They devote 100% of their time to Six Sigma. They assist Champions and
guide Black Belts and Green Belts. Apart from statistical tasks, their time is spent
on ensuring consistent application of Six Sigma across various functions and
departments.
• Black Belts operate under Master Black Belts to apply Six Sigma methodology to
specific projects. They devote 100% of their time to Six Sigma. They primarily
focus on Six Sigma project execution, whereas Champions and Master Black
Belts focus on identifying projects and functions for Six Sigma.
• Green Belts are the employees who take up Six Sigma implementation along
with their other job responsibilities. They operate under the guidance of Black
Belts.
Quality management tools and methodologies used in Six Sigma
Six Sigma makes use of a great number of established quality management
methods that are also used outside of Six Sigma. The main methods used include:

• 5 Whys
• Analysis of variance
• ANOVA Gauge R&R
• Axiomatic design
• Business Process Mapping
• Catapult exercise on variability
• Cause & effects diagram (also known as fishbone or Ishikawa diagram)
• Chi-square test of independence and fits
• Control chart
• Correlation
• Cost-benefit analysis
• CTQ tree
• Quantitative marketing research through use of Enterprise Feedback
Management (EFM) systems
• Design of experiments
• Failure mode and effects analysis
• General linear model
• Histograms
• Homoscedasticity
• Pareto chart
• Pick chart
• Process capability
• Regression analysis
• Root cause analysis
• Run charts
• SIPOC analysis (Suppliers, Inputs, Process, Outputs, Customers)
• Stratification
• Taguchi methods
• Thought process map
• TRIZ

The pharmaceutical industry's recent emphasis on continuous improvement,
operational excellence, and process analytical technology has motivated us to
evaluate the basic tenets of our approach to quality. Historically, the ability to
ensure that a drug meets its intended form, fit, and function has been achieved
through the application of the quality infrastructure, i.e., standard operating
procedures, policies, specifications; qualification or validation, i.e., commissioning,
installation qualification (IQ), operational qualification (OQ), performance
qualification (PQ), process validation; and testing, i.e., in-process and final release.
However, despite these processes, the number of drug recalls continues to rise,
escalating from 176 in 1998 to 354 in 2002, according to the US Center for Drug
Evaluation and Research.1

The use of regulations as a primary means of ensuring product quality began to
decline in early 2000, when industry pushed back on FDA's Part 11 compliance
requirements for electronic signatures and electronic data exchange, challenging
the cost and effort associated with implementation, versus the actual benefit to
product quality. Today, however, industry and regulatory agencies are moving
toward a more scientific approach to ensuring product quality.

The International Conference on Harmonization (ICH) Q8 and Q9 guidance
documents2,3, for example, define a scientific approach to process
characterization, advocating a quality by design framework. Risk management is an
integral part of this approach.

Similarly, the US FDA's "GMPs for the Twenty-First Century" initiative focused on
quality by design, risk management, continuous process improvement, and quality
systems. Rolled out in 2004, this initiative challenged industry's traditional
approaches to ensuring product quality by encouraging employees to look beyond
traditional inspection methodologies for ensuring product performance. The early
process and product characterization emphasized in the quality-by-design and risk-
management approaches do not inherently conflict with validation. On the contrary,
by deepening the level of scientific understanding of a manufacturing process, the
approaches ensure that a process is well understood before it is considered
"validated." Methods that involve continuous improvement and real-time control,
however, do pose a significant question: Are these quality methods inconsistent
with the basic tenets of validation that have served as the backbone of the
industry's quality structure for so many years? Once you have "validated" a
manufacturing process, how much can you improve it—through real-time control or
any sort of continuous improvement step incorporated into Lean, Six Sigma, etc.—
without having to file manufacturing supplements with FDA? How much of an
impediment are those filing requirements?

THE VALIDATION PARADIGM

The challenge of validation is that it has been viewed as a necessary evil—a
regulatory activity that cannot be avoided when manufacturing regulated products.
The effort and cost associated with validation continue to escalate as industry and
regulatory groups increase their understanding of pharmaceutical processes and
identify an increasing number of process variables that must be controlled. Biotech
adds another layer of complexity by introducing the qualified pilot or intermediate-
scale model as an integral component of the validation equation.4

The prohibitive cost of characterization studies at full scale requires us to establish
clear, scientific arguments to show how process development studies relate to full-
scale validation lots. The complexity of biotech processes demands an even higher
level of scientific argument. As we increase our understanding of biopharmaceutical
processing, the value associated with traditional validation diminishes, and industry
responds accordingly.

The integration of equipment validation and process validation provided incentive to
measure the capability of our processes and analytical methods. However,
somewhere along the way, the incentive for validation shifted from a need to
measure processes, to a need to satisfy a regulatory requirement as quickly and as
cheaply as possible.
Over time, industry came to believe that validation had to include a broader range
of equipment and processes and a greater level of detail, and as a result, validation
costs went up. In response, the industry attempted to distribute the responsibility
for validation among participants in the quality process. For example, industry
suddenly decided that validation had to include commissioning activities and
engineering precursor activities to equipment qualification, so companies started
requiring that contractors and subcontractors test and document various aspects of IQ. The
approach of requiring increased involvement from vendors also extended to factory
acceptance tests. Such tests—which have ranged from simple vendor testing and
certification to constructing simulator panels to mimic the actuation of automated
components—have also ranged in their true relevance to the validation process.

Market drivers completely unrelated to the field of validation often have determined
the amount of effort put into validation. For example, when equity markets dried up
in the late 1990s, emerging biotech companies shifted their emphasis from
scientific investigation to bringing product to market as quickly as possible. The
industry looked for cheaper and faster ways to push through the validation process
to move programs forward quickly. The result was simpler process validation studies
that focused on building three validation lots to demonstrate process predictability,
rather than focusing on true process understanding. Likewise, companies began
buying more equipment from suppliers who offered "canned" validation protocols
that could be purchased and implemented, rather than developing their own
protocols to challenge the equipment and thus increase the probability the
equipment would meet the needs of the process. The implication of these shifts was
that validation was necessary, but not essential to sound process development.

This short-cut approach to validation resulted in processes that were less stable at
the commercial scale. FDA's recent revelations about high-profile, approved
products that may be unsafe, such as Vioxx and Serevent, and Congress's pressure
on industry to find ways to reduce the cost of drugs to the general public, have
impacted both Big Pharma and biotech. In response, the industry has recognized
the need for a better way to reduce process and product risk.

The answer was a shift to a more scientifically driven development approach, often
referred to as "Operational Excellence," or "Process Excellence." This approach
integrates process, quality, and business requirements to promote the science of
development.

These quality initiatives integrate Six Sigma, Lean Manufacturing, Kepner-Tregoe,
Theory of Constraints, Design of Experiments, and Balanced Scorecards to establish
process understanding. These methodologies emphasize the need to objectively
define, measure, and characterize critical variables that affect a process. While
testing and data collection are integral components, verification is the final
culmination of the quality assessment—not the basis of quality.

Looking closely at these approaches, however, reveals that they are based in large
part upon an approach that has been integral to our quality systems for over 70
years: Walter Shewhart's cycle of Plan, Do, Check, Act (PDCA).

PLAN, DO, CHECK, ACT


Figure 1. PDCA "The Shewhart Cycle"

Walter Shewhart, an enterprising statistician who worked at Bell Laboratories in the
US during the 1930s, developed the science of Statistical Process Control. An
offshoot was the PDCA Cycle, often referred to as "the Shewhart Cycle." This tool
was adopted and promoted from the 1950s on, by W. Edwards Deming, the
renowned quality management authority, and as a result the tool also became
known as "the Deming Wheel" (Figure 1).
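The core of Shewhart's Statistical Process Control can be sketched in a few lines: a center line at the process mean with control limits three standard deviations away, where sigma is estimated from the average moving range divided by the d2 constant 1.128 (the usual convention for an individuals chart). The measurement data below are illustrative assumptions, not material from the article.

```python
# Sketch of a Shewhart individuals control chart (hypothetical data).
# Sigma is estimated from the average moving range / 1.128, the d2
# constant for subgroups of size 2.

def control_limits(samples):
    """Return (LCL, center, UCL) as the center line +/- 3 sigma."""
    center = sum(samples) / len(samples)
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(samples):
    """Flag points outside the 3-sigma limits (special-cause variation)."""
    lcl, _, ucl = control_limits(samples)
    return [x for x in samples if x < lcl or x > ucl]

measurements = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 9.7, 10.0, 10.3, 14.5]
print(out_of_control(measurements))  # [14.5] -- a special-cause signal
```

Points inside the limits are treated as common-cause variation; points outside them signal a special cause worth investigating.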

The PDCA Cycle was the first tool broadly adopted as a framework for continuous
improvement. PDCA is a four-step quality improvement cycle that promotes
continuous improvement based on the method of design (plan), execution (do),
analysis (check), and evaluation (act). Sometimes referred to as plan/do/study/act,
the cycle emphasizes the constant attention and reaction to factors that affect
quality.

The chief advantage of the PDCA cycle—flexibility in moving through each phase of
the cycle—is also its biggest challenge, because it leaves the door open for subjectivity.
Subjectivity has long been the downfall of our industry. Without a clear vision for
success or a defined method for evaluation, the potential exists to rely on
unscientific process development and characterization activities, which can lead to
incorrect or incomplete conclusions. For example, univariate analysis methods—
often called One-Factor-at-a-Time (OFAT) studies5 —have been the backbone of the
small-molecule pharma industry, as well as the biopharm industry. Such studies,
however, do not possess the power to fully characterize a process. The result is a
false sense of security that the process characteristics are understood.
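The weakness of OFAT studies is easiest to see with a toy response surface. The function and factor levels below are hypothetical, chosen so that yield improves only when two factors are raised together; an OFAT sweep from the baseline sees no effect at all, while a full 2x2 factorial finds the best corner.

```python
# Hypothetical response with a pure two-factor interaction that an
# OFAT study cannot detect (illustrative numbers, not from the article).
from itertools import product

def recovery(a, b):
    return 70 + 20 * a * b  # the interaction term OFAT never exercises

# OFAT from the (0, 0) baseline: vary one factor at a time
ofat_runs = {(0, 0): recovery(0, 0),
             (1, 0): recovery(1, 0),
             (0, 1): recovery(0, 1)}
print(ofat_runs)  # every run yields 70 -> "neither factor matters"

# A full 2x2 factorial covers every corner of the design space
factorial_runs = {(a, b): recovery(a, b) for a, b in product([0, 1], repeat=2)}
print(max(factorial_runs, key=factorial_runs.get))  # (1, 1), recovery 90
```

This is the false sense of security described above: the OFAT data are internally consistent, yet the conclusion drawn from them is wrong.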

Figure 2. Cube Plot for Protein Recovery

An analogy would be that of trying to solve the popular "Rubik's Cube" puzzle. It
may be relatively simple to get one side of the cube all one color, thus providing the
impression of progress towards your goal. However, the reality is that you are
actually further from success than when you started the exercise (Figure 2).
Because of these limitations, other industries abandoned the OFAT approach 30
years ago, deeming it ineffective for process characterization and verification.

The biopharmaceutical industry, too, has come to recognize that the OFAT approach
is insufficient. The industry has also realized that to be successful in combining
quality, technical, and business requirements in the drug development lifecycle, it
must realign not only its scientific approach to process understanding, but also its
thinking within the organization. As a result, Operational Excellence initiatives have
moved to frameworks such as Six Sigma to provide a roadmap that can meet this
need.

SIX SIGMA AND ITS ROADMAP


Figure 3. Six Sigma as an organizational development and leadership tool

In 1986, Motorola established a framework designed to integrate quality, process,
and business requirements into the product development lifecycle. Motorola
recognized that variation is the death knell of any process, so the company set out
to establish a methodology to identify and eliminate variation. They called this
approach Six Sigma6 (Figure 3).
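The "six sigma" name refers to a process whose nearest specification limit sits six standard deviations from the short-term mean; with the customary 1.5-sigma allowance for long-term drift, this corresponds to roughly 3.4 defects per million opportunities. A minimal sketch of that mapping, assuming a normal, one-sided defect model (the 1.5-sigma shift is an industry convention, not something derived in the article):

```python
# Sigma level -> defects per million opportunities (DPMO), assuming
# normality and the conventional 1.5-sigma long-term shift.
from math import erfc, sqrt

def dpmo(sigma_level, shift=1.5):
    """One-sided normal tail probability, scaled to per-million."""
    z = sigma_level - shift
    return 0.5 * erfc(z / sqrt(2)) * 1_000_000

print(round(dpmo(6), 1))  # ~3.4 DPMO, the Six Sigma benchmark
print(round(dpmo(3)))     # a 3-sigma process: ~66,807 DPMO
```

The steep drop between the two figures is the argument Motorola made for driving variation out of a process rather than inspecting defects out of the product.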

Figure 4. The DMAIC Roadmap

In the late 1990s, CEOs Jack Welch from GE and Larry Bossidy from Allied Signal
adapted the Motorola model into a structured methodology called the DMAIC roadmap.
DMAIC is an acronym for Define, Measure, Analyze, Improve and Control. These are
the five phases necessary to measure, characterize, and control a process (Figure
4).

Within each step of the roadmap, a defined set of tools is applied. Each phase in
the DMAIC process is intended to guide the members of an improvement team
through the project in a manner that provides relevant data and in-depth process
understanding. The DMAIC project management approach allows businesses to
make the best possible decisions with the available data and resources. The five
steps of the DMAIC process are as follows:

1. Define: Clearly define the problem and relate it to the customer's needs
(generally, with a cost benefit to the organization identified).

2. Measure: Measure what is key to the customer and know that the measurement
is good.

3. Analyze: Search for and identify the most likely root causes.

4. Improve: Determine the root causes and establish methods to control them.

5. Control: Monitor and make sure the problem does not come back.

Within each DMAIC phase, there is a set of deliverables that must be completed to
ensure all project requirements are met. A summary of the deliverables and typical
activities for each phase of the DMAIC process is shown in Table 1.

Looking closely at the tools within the DMAIC methodology reveals elements that
have been part of the quality toolkit since its inception. Cause and effect diagrams,
Failure Mode and Effects Analysis (FMEA), and process capability analysis, among
others, have been used broadly by process and quality engineers in multiple
industries for years. What separates the DMAIC roadmap from the isolated
application of these individual tools is the methodology around the application of
the tools. In DMAIC, the process evaluation is based on the objective acquisition and
analysis of data, in lieu of representative testing and inference.
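Process capability analysis, one of the tools named above, can itself be sketched briefly: Cpk compares the distance from the process mean to the nearest specification limit against three standard deviations. The titer data and specification limits below are hypothetical, used only to show the calculation.

```python
# Process capability sketch (hypothetical data and spec limits).
# Cpk = min(USL - mean, mean - LSL) / (3 * sigma); a value of 1.33 or
# higher is conventionally taken to indicate a capable process.
from statistics import mean, stdev

def cpk(samples, lsl, usl):
    mu, sigma = mean(samples), stdev(samples)
    return min(usl - mu, mu - lsl) / (3 * sigma)

titers = [4.8, 5.1, 5.0, 4.9, 5.2, 5.0, 4.9, 5.1]
print(round(cpk(titers, lsl=4.0, usl=6.0), 2))  # ~2.55: well within spec
```

Unlike pass/fail release testing, the index quantifies *how much* margin the process holds against its specifications.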

LEAN MANUFACTURING
Although Six Sigma and the DMAIC toolkit focused on eliminating process variability,
there still remained the need to bring products to market faster and more cheaply.
As a result, the biopharmaceutical industry has turned to the principles of Lean
Manufacturing to increase the efficiency of our processes. The ideas of Lean
Manufacturing are based on the Toyota Production System approach of eliminating
waste in every aspect of a company's operation. Lean focuses on time variability, in
contrast to Six Sigma's focus on process variability. In their book Lean Thinking, Jim
Womack and Daniel Jones7 recast the principles of Lean into five principles:
1. Value: Every company needs to understand the value customers place on their
products and services. It is this value that determines how much money the
customer is willing to pay for them. This analysis leads to a top-down, target-costing
approach that has been used by Toyota and others for many years. Target costing
focuses on what the customer is willing to pay for certain products, features, and
services. From this, the required cost of these products and services can be
determined. It is the company's job to eliminate waste and cost from the business
processes so that the customer's price can be achieved at great profit to the
company. In the biopharmaceutical and pharmaceutical world, value is often
associated with quality and data, rather than with standard cost.

2. Value Stream: The value stream is the entire flow of a product's lifecycle, from
the origin of the raw materials used to make the product through to the customer's
cost of using, and ultimately disposing of, the product. Only by studying and
obtaining a clear understanding of the value stream (including its value-added and
waste) can a company truly understand the waste associated with the manufacture
and delivery of a product or service.

3. Flow: One significant key to the elimination of waste is flow. If the value chain
stops moving forward for any reason, then waste occurs. The trick is to create a
value stream in which the product (or its raw materials, components, or sub-
assemblies) never stops in the production process, because each aspect of
production and delivery is in harmony with the other elements. Carefully designed
flow across the entire value chain will minimize waste and increase value to the
customer. Achieving this kind of flow is a challenge in our industry because many of
our processes are batch processes. Even so, within the context of the total value
stream, there are significant opportunities to move towards continuous flow.

4. Pull: A traditional Western manufacturer uses a style of production planning and
control whereby production is "pushed" through the factory based upon a forecast
and a schedule. A pull approach dictates that we do not make anything until the
customer orders it. To achieve this requires great flexibility and very short cycle
times of design, production, and delivery of the products and services. It also
requires a mechanism for informing each step in the value chain what is required of
them today, based on customers' needs.

5. Perfection: A lean manufacturer sets perfection as a target. The idea of total
quality management is to systematically and continuously remove the root causes
of poor quality from the production processes so that the plant and its products
move toward perfection. This relentless pursuit of the perfect is the key attitude of
an organization that is "going for lean."
Lean has been enthusiastically embraced by our industry because the tools are
simple and improvement can be realized quickly. Although Lean is often initiated
because of cost or efficiency reasons, there is another perspective to Lean that is
often overlooked: quality.
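The waste that Lean targets is easy to quantify with a value-stream sketch: process cycle efficiency is value-added time divided by total lead time. The step names and times below are hypothetical, chosen to show how a single long hold can dominate the lead time of a batch process.

```python
# Value-stream sketch: process cycle efficiency = value-added time /
# total lead time (step names and hours are hypothetical).
steps = [
    ("buffer prep",     {"value_added_h": 4,  "waiting_h": 2}),
    ("fermentation",    {"value_added_h": 96, "waiting_h": 24}),
    ("QC release hold", {"value_added_h": 8,  "waiting_h": 240}),
]

value_added = sum(s["value_added_h"] for _, s in steps)
lead_time = sum(s["value_added_h"] + s["waiting_h"] for _, s in steps)
print(f"cycle efficiency: {value_added / lead_time:.0%}")
```

In this sketch the QC hold, not the processing itself, consumes most of the lead time, which is exactly the kind of non-value-added delay a value-stream map is meant to expose.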

Figure 5. DMAIC and Lean tools deployed in the Shewhart Cycle

Our industry should think of Lean as a quality initiative—not a business-driven one.
While it is true that the basis for Lean is to eliminate waste and maximize the
value-added activities of a process, another benefit of Lean is the way it simplifies and
standardizes the process. The result is improved predictability. If you map the
DMAIC and Lean tools together against the Shewhart PDCA Cycle, you find they
follow the same framework; the tools within both toolkits are designed to address
the same basic requirements of the PDCA cycle (Figure 5).

VALIDATION AND PLAN, DO, CHECK, ACT

Table 1a. Summary of DMAIC phase deliverables (continued)

Mapping validation, as applied by the biopharmaceutical industry today, may seem
inconsistent with the principles of the Shewhart PDCA Cycle, DMAIC, and Lean. The
basis of traditional validation is verification against predetermined acceptance
criteria. However, if we divide the validation process into its components, there is
more similarity than difference between validation and these improvement
methods. The steps of the validation life cycle map well to the Control, Measure,
and Analyze phases of the DMAIC roadmap. What is missing is the Improve stage.

Table 1b. DMAIC phase deliverables

Six Sigma and Lean principles are predicated on the absolute requirements of
demonstrating that the process is in control. By building on an efficient and
objective framework for characterizing, measuring, and optimizing a process, it is
possible to achieve a level of confidence that the process will be predictable and
reproducible. No amount of testing will ever approach this level of confidence;
heightened testing and large sampling can still only infer the process is in control.
(As many have said, you cannot test quality into the product.) The irony in applying
validation to the PDCA model is that its efficacy is only as good as one's
understanding of the key process input variables that steer the process. In the
absence of this, validation degenerates to a paper exercise.

CONCLUSION


The twenty-first century GMP initiative advocates the need for building process
understanding throughout the process development lifecycle. Tools such as Six
Sigma, DMAIC, and Lean Manufacturing provide a framework for objective
characterization and analysis of a process's key parameters. This knowledge,
coupled with a quality system framework for specification, in-process, and release
testing, can significantly elevate the level of quality built into the final product or
process. While at first glance validation might appear to be inconsistent with these
improvement initiatives, the elements of the validation lifecycle map to the control,
measure, and analysis phases of the PDCA lifecycle. The most effective application
of validation is achieved by using these optimization tools in the process
characterization and development phases of a process long before validation. Until
characterization and evaluation frameworks are more fully integrated into the drug
development lifecycle, validation will remain a costly and time-consuming exercise
capable only of providing limited assurance of process and product stability.
