
2009 ASQ QMD Conference

CMMI High Maturity Made Practical

Software Engineering Institute
Carnegie Mellon University
Pittsburgh, PA 15213

Robert W. Stoddard II
March 5, 2009

© 2009 Carnegie Mellon University
NO WARRANTY
THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE
MATERIAL IS FURNISHED ON AN "AS-IS" BASIS. CARNEGIE MELLON UNIVERSITY
MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO
ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR
PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM
USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY
WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT,
TRADEMARK, OR COPYRIGHT INFRINGEMENT.

Use of any trademarks in this presentation is not intended in any way to infringe on the
rights of the trademark holder.

This presentation may be reproduced in its entirety, without modification, and freely
distributed in written or electronic form without requesting formal permission. Permission is
required for any other use. Requests for permission should be directed to the Software
Engineering Institute at permission@sei.cmu.edu.

This work was created in the performance of Federal Government Contract Number
FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software
Engineering Institute, a federally funded research and development center. The
Government of the United States has a royalty-free government-purpose license to use,
duplicate, or disclose the work, in whole or in part and in any manner, and to have or
permit others to do so, for government purposes pursuant to the copyright license under
the clause at 252.227-7013.


Permission for Tool Screen Shots

Portions of the input and output contained in this module are printed with permission of
Oracle (formerly Decisioneering). Crystal Ball 7.2.2 (Build 7.2.1333.0) is used to capture
screenshots in this module. The Web page for Crystal Ball is available at
http://www.crystalball.com

Portions of the input and output contained in this presentation are printed with permission
of Minitab Inc., using version 15. The Minitab company web page is
http://www.minitab.com

Topics

What is CMMI High Maturity?

Why should Managers and Executives be interested?

What’s in the Fine Print?

How does this change Daily Life?

A New Leadership Role for Managers and Executives!

What is CMMI High Maturity?


“Capability Maturity Model Integration” (CMMI) by the SEI

“CMMI, Second Edition: Guidelines for Process Integration and Product Improvement”
by Mary Beth Chrissis, Mike Konrad, and Sandy Shrum
ISBN-10: 0321279670
ISBN-13: 978-0321279675

Technical Report CMU/SEI-2006-TR-008
http://www.sei.cmu.edu/publications/documents/06.reports/06tr008.html

CMMI Organizational Maturity Levels

Level 1: Initial
Level 2: Managed
Level 3: Defined
Level 4: Quantitatively Managed
Level 5: Optimizing


CMMI Organizational Maturity Levels - Level 1: Initial

• Processes are ad hoc and chaotic
• Success depends on individual heroics and competence
• Projects usually exceed budgets and overrun schedule commitments
• Organizations tend to over-commit, abandon processes during crisis, and have trouble repeating success


CMMI Organizational Maturity Levels - Level 2: Managed

• Processes are planned and executed according to policy
• Skilled people with adequate resources are in place
• Processes are maintained during crisis
• Management and customers have insight at different milestones during the process
• Work products are controlled


CMMI Organizational Maturity Levels - Level 3: Defined

• Work processes are well defined in terms of standards, procedures, tools, and methods
• Organizational standard processes are established and used as assets by all projects
• Projects become more consistent with each other, although individual projects tailor the organizational processes


CMMI Organizational Maturity Levels - Level 4: Quantitatively Managed

• Quantitative objectives are established for quality and performance at the organization and project levels
• Critical processes are statistically managed
• Process performance models are used to predict interim and final project outcomes, thereby enabling projects to make midstream changes

CMMI Organizational Maturity Levels


• Continual improvement
Level 1: Initial based on quantitative
understanding g of p
processes
Level 2: Managed
• Primarily aimed at reducing
Level 3: Defined common cause variation (e.g.
increasing process
Level 4: Quantitatively Managed capability)
• Formalized Corrective
Level 5: Optimizing Action, Innovative Changes,
anddDDeployment
l t off process
changes
• Increased agility via
dynamic process
adjustments driven by
objective data


CMMI Process Areas

CAR   Causal Analysis and Resolution
CM    Configuration Management
DAR   Decision Analysis and Resolution
IPM   Integrated Project Management
MA    Measurement and Analysis
OID   Organizational Innovation and Deployment
OPD   Organizational Process Definition
OPF   Organizational Process Focus
OPP   Organizational Process Performance
OT    Organizational Training
PI    Product Integration
PMC   Project Monitoring and Control
PP    Project Planning
PPQA  Process and Product Quality Assurance
QPM   Quantitative Project Management
RD    Requirements Development
REQM  Requirements Management
RSKM  Risk Management
SAM   Supplier Agreement Management
TS    Technical Solution
VAL   Validation
VER   Verification

CMMI Process Areas by Maturity Level

Maturity Level               | Process Mgmt | Project Mgmt | Engineering          | Support
Initial (ML1)                |              |              |                      |
Managed (ML2)                |              | PP, PMC, SAM | REQM                 | CM, PPQA, MA
Defined (ML3)                | OPF, OPD, OT | IPM, RSKM    | RD, TS, PI, VER, VAL | DAR
Quantitatively Managed (ML4) | OPP          | QPM          |                      |
Optimizing (ML5)             | OID          |              |                      | CAR


Some CMMI High Maturity References - 1

OPP SP 1.5 Establish Process-Performance Models
Establish and maintain the process-performance models for the organization’s set of standard processes.

OPP SP = Organizational Process Performance Specific Practice

Some CMMI High Maturity References - 2

QPM SP 1.2 Compose the Defined Process
Select the sub-processes that compose the project’s defined process based on historical stability and capability data.

QPM SP 1.4 Manage Project Performance
Monitor the project to determine whether the project’s objectives for quality and process performance will be satisfied, and identify corrective action as appropriate.

QPM SP = Quantitative Project Management Specific Practice

Some CMMI High Maturity References - 3

QPM SP 2.2 Apply Statistical Methods to Understand Variation
Establish and maintain an understanding of the variation of the selected sub-processes using the selected measures and analytic techniques.

QPM SP 2.3 Monitor Performance of the Selected Sub-processes
Monitor the performance of the selected sub-processes to determine their capability to satisfy their quality and process-performance objectives, and identify corrective action as necessary.


Some CMMI High Maturity References - 4

CAR SP 2.2 Evaluate the Effect of Changes
Evaluate the effect of changes on process performance.

OID SP 1.2 Identify and Analyze Innovations
Identify and analyze innovative improvements that could increase the organization’s quality and process performance.

CAR SP = Causal Analysis and Resolution Specific Practice
OID SP = Organizational Innovation and Deployment Specific Practice

Some CMMI High Maturity References - 5

OID SP 1.3 Pilot Improvements
Pilot process and technology improvements to select which ones to implement.

OID SP 1.4 Select Improvements for Deployment
Select process and technology improvements for deployment across the organization.

OID SP 2.3 Measure Improvement Effects
Measure the effects of the deployed process and technology improvements.

Why should Managers and Executives be Interested?


Experiences without High Maturity Practices* - 1

A multi-year, 50-person project failed to deliver a product to market because the performance of a first-time use of Java was neither modeled nor predicted!

A lack of modeling the business case and predicting business outcomes for an innovative technology product resulted in a multi-billion dollar loss.

A marketing executive’s reassignment occurred primarily due to the incorrect analysis of market data and trends over a multi-year period, which resulted in a seriously deficient portfolio and an unrecoverable loss of world market share.

* Author’s first-hand or second-hand observations over the past 12 years



Experiences without High Maturity Practices* - 2

A history of products shipping late became so serious that key customers began asking for project durations so they could multiply them by a factor of 2. Schedule uncertainty was not modeled!

A lack of modeling requirements volatility over a 5-year period, across 150 new product development efforts, prevented an organization from discerning that late changes were the primary culprit in shipping late. Customers were finally convinced of the significant detriment their late changes had on product schedules after being shown modeling results. Modeling also motivated shorter development schedules to further reduce the chance of customer volatility!

* Author’s first-hand or second-hand observations over the past 12 years

Experiences with High Maturity Practices* - 1

A four-fold reduction in the cost of engineering testing and evaluation of a product was made possible by the modeling and prediction of both the volume and type of latent defects!

A ten-fold reduction in the normal development schedule of a new, classified missile was enabled primarily by the use of design of experiments and performance modeling!

Similar results were achieved for the development and calibration of an innovative, high-speed printer for several worldwide printing chains!

* Author’s first-hand or second-hand observations over the past 12 years



Experiences with High Maturity Practices* - 2

Modeling and predicting high-risk software modules enabled an organization to move from 100% code inspections down to a sampling inspection of 10% without degradation of quality.

Modeling enabled an organization to decide when it was safe to use much lighter inspection methods, such as the email routing and review of artifacts.

Modeling enabled an organization to decide what levels of design and code complexity were acceptable based on engineer experience and other factors. This enabled proper staffing of projects and on-time deliveries.

* Author’s first-hand or second-hand observations over the past 12 years

Experiences with High Maturity Practices* - 3

Repeated model updates of the business case, for new product development projects within a management gate review system, enabled an organization to stop infeasible product efforts as soon as they were predicted to be infeasible.

This prevented any recurrence of an earlier experience in which an infeasible product effort progressed for two years, without shipping, resulting in an ultimate loss of $50M.

* Author’s first-hand or second-hand observations over the past 12 years


What’s in the Fine Print?

[Diagram: nested model categories. All Models (Qualitative and Quantitative) contain Quantitative Models (Deterministic, Statistical, Probabilistic), which contain Statistical or Probabilistic Models. A CMMI Process Performance Model sits at the center: it is statistical or probabilistic, predicts interim outcomes, and uses controllable x factors tied to processes and/or sub-processes. Outside the center are models that are anecdotal or based on biased samples, involve no uncertainty or variation, model only final outcomes, use only uncontrollable factors, or model only phases or lifecycles.]


Healthy Ingredients of CMMI Process Performance Models

Statistical, probabilistic, or simulation in nature
Predict interim and/or final project outcomes
Use controllable factors tied to sub-processes to conduct the prediction
Model the variation of factors and understand the predicted range or variation of the outcomes
Enable “what-if” analysis for project planning, dynamic re-planning, and problem resolution during project execution (see the sketch below)
Connect “upstream” activity with “downstream” activity
Enable projects to achieve mid-course corrections to ensure project success
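The sketch below illustrates these ingredients in miniature: a process performance model reduced to a fitted equation plus Monte Carlo sampling over the variation of two controllable x factors. All names, coefficients, and distributions here are hypothetical illustrations, not values from this presentation.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000

    # Controllable x factors, each sampled with its historically observed variation
    inspection_rate = rng.normal(150, 20, n)     # LOC reviewed per hour (assumed)
    test_coverage = rng.uniform(0.70, 0.95, n)   # fraction of code covered (assumed)

    # Hypothetical fitted equation for delivered defect density, plus a residual
    # term for the variation the x factors do not explain
    defect_density = (6.0 + 0.01 * inspection_rate
                      - 5.0 * test_coverage
                      + rng.normal(0, 0.3, n))

    print(f"median prediction: {np.median(defect_density):.2f} defects/KLOC")
    print(f"90% prediction range: {np.percentile(defect_density, [5, 95]).round(2)}")

Re-running the simulation with a factor pinned to a proposed value (e.g., test_coverage held at 0.95) is exactly the kind of “what-if” analysis described above.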


Quantifying Relationships and Statistical Models

Choosing a model by the data type of the outcome (Y) and the factors (X):

             | Y Continuous                           | Y Discrete
X Discrete   | ANOVA & Dummy Variable Regression      | Chi-Square; Logit, Logistic Regression
X Continuous | Simple, Linear & Non-Linear Regression | Logistic Regression

ANOVA & Dummy Variable Regression Models

Using these controllable factors… → To predict this outcome!

Type of reviews conducted; type of design method; language chosen; types of testing → Delivered Defect Density

High-medium-low domain experience; architecture layer; feature; team; lifecycle model; primary communication method → Productivity

Estimation method employed; estimator; type of project; high-medium-low staff turnover; high-medium-low complexity; customer; product → Cost and Schedule Variance

Team; product; high-medium-low maturity of platform; maturity or capability level of process; decision-making level in organization; release → Cycle Time or Time-to-Market

Iterations on req’ts; yes/no prototype; method of req’ts elicitation; yes/no beta test; yes/no on-time; high-medium-low customer relationship → Customer Satisfaction (as a percentile result)
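As a hedged illustration of the first row above, the Python sketch below fits a dummy-variable regression of delivered defect density on two categorical factors. The data and column names are invented for the example.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical history of completed projects
    df = pd.DataFrame({
        "review_type": ["inspection", "walkthrough", "none"] * 4,
        "domain_exp": ["high", "high", "high", "medium", "medium", "medium",
                       "low", "low", "low", "high", "medium", "low"],
        "defect_density": [0.6, 1.1, 1.8, 0.9, 1.5, 2.4,
                           1.4, 2.2, 3.3, 0.7, 1.6, 3.1],
    })

    # C(...) dummy-codes each categorical factor; ANOVA-style effects fall out
    fit = smf.ols("defect_density ~ C(review_type) + C(domain_exp)", df).fit()
    print(fit.params)       # estimated effect of each review type and experience level
    print(fit.f_pvalue)     # overall model significance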


Multiple Regression

Using these controllable factors… → To predict this outcome!

Req’ts volatility; design and code complexity; test coverage; escaped defect rates → Delivered Defect Density

Staff turnover %; years of domain experience; employee morale survey %; volume of interruptions or task switching → Productivity

Availability of test equipment %; req’ts volatility; complexity; staff turnover rates → Cost and Schedule Variance

Individual task durations in hrs; staff availability %; percentage of specs undefined; defect arrival rates during inspections or testing → Cycle Time or Time-to-Market

Resolution time of customer inquiries; resolution time of customer fixes; percent of features delivered on-time; face time per week → Customer Satisfaction (as a percentile result)
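A minimal multiple-regression sketch in the spirit of the first row above. The factor names follow the table, but the data is invented and illustrative only.

    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.DataFrame({
        "reqts_volatility": [0.05, 0.20, 0.10, 0.30, 0.15, 0.25, 0.08, 0.18],
        "complexity":       [8, 14, 10, 18, 12, 16, 9, 13],
        "test_coverage":    [0.90, 0.70, 0.85, 0.60, 0.80, 0.65, 0.88, 0.75],
        "defect_density":   [0.5, 1.8, 0.9, 2.9, 1.3, 2.4, 0.6, 1.6],
    })

    fit = smf.ols("defect_density ~ reqts_volatility + complexity + test_coverage",
                  df).fit()

    # What-if: predicted delivered defect density for a planned project
    planned = pd.DataFrame({"reqts_volatility": [0.12], "complexity": [11],
                            "test_coverage": [0.82]})
    print(fit.predict(planned))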


Chi-Square & Logistic Regression

Using these controllable factors… → To predict this outcome!

Programming language; high-medium-low schedule compression; req’ts method; design method; coding method; peer review method → Types of Defects

Predicted types of defects; high-medium-low schedule compression; types of features implemented; parts of architecture modified → Types of Testing Most Needed

Architecture layers or components to be modified; type of product; development environment chosen; types of features → Types of Skills Needed

Types of customer engagements; type of customer; product involved; culture; region → Results of Multiple Choice Customer Surveys

Product; lifecycle model chosen; high-medium-low schedule compression; previous high risk categories → Risk Categories of Highest Concern
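The chi-square side of this pairing can be sketched as a simple test of association between a discrete x and a discrete Y. The counts below are hypothetical.

    import numpy as np
    from scipy.stats import chi2_contingency

    # Rows: programming language; columns: defect type (logical, data, algorithmic)
    counts = np.array([[40, 25, 10],    # language A
                       [30, 35, 20],    # language B
                       [15, 20, 30]])   # language C
    chi2, p, dof, expected = chi2_contingency(counts)
    print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4f}")  # small p => association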


Logistic Regression

Using these controllable factors… → To predict this outcome!

Inspection preparation rates; inspection review rates; test case coverage %; staff turnover rates; previous escape defect rates → Types of Defects

Escape defect rates; predicted defect density entering test; available test staff hours; test equipment or test software availability → Types of Testing Most Needed

Defect rates in the field; defect rates in previous release or product; turnover rates; complexity of issues expected or actual → Types of Skills Needed

Time (in hours) spent with customers; defect rates of products or releases; response times → Results of Multiple Choice Customer Surveys

Defect densities during inspections and test; time to execute tasks normalized to work product size → Risk Categories of Highest Concern


Other Modeling Approaches Used in Industry

• Probabilistic Decision Trees

• Monte Carlo Simulation

• Bayesian Belief Networks

• Discrete Event Process Simulation

• Markov Models

• Petri-Nets

• Neural Nets

• Systems Dynamics Models



When and How Many Process Performance Models Do We Need?

[Diagram: a project timeline from Project Proposal and Project Planning at Project Start, through Requirements Elicitation, Requirements Management, Software Design, Software Coding, Software Unit Testing, Integration Testing, Systems Testing, and Customer Acceptance Testing at Project Finish, with Project Forecasting spanning the whole timeline. Process performance models can apply at each of these points.]


Process Performance Models View Processes Holistically

Processes may be thought of holistically as a system that includes the people, materials, energy, equipment, and procedures necessary to produce a product or service.

[Diagram: People, Material, Energy, Equipment, and Procedures feed Work Activities over Time, transforming Requirements & Ideas into Products & Services.]

How does this change Daily Life?


Traditional Management Review Perspective

Management has come to realize that just looking at the


customary lagging outcomes is like driving a car using only the
rear-view mirror.


High Maturity Management Review Perspective

Management dashboards in High Maturity organizations include not only outcomes but leading indicators, such as the controllable x factors used in process performance models.

Thus, management begins asking for an additional 3-5 leading indicators for each traditional, lagging indicator.


A Change in Senior Management Behavior

Lower maturity organizations spend 80% of their management review meetings looking backward, due to their focus on what has occurred via lagging indicators (e.g., current values of cost, schedule, and quality).

High maturity organizations spend 80% of their time looking forward, due to their focus on what will happen, by examining leading indicators and model predictions.

Thus, management can effect course corrections within projects by adjusting controllable process factors.


[Dashboard figure: the dark blue lines represent potential uses of predictive models. The dashboard links objectives and success criteria across reporting periods to success indicators (lagging: have the objectives been achieved?), analysis indicators (leading: what are the results of specific tasks, and what is the impact of the tactics?), tasks to accomplish objectives, and progress indicators (how well are plans proceeding?). Task-level views for the project manager (e.g., test cases complete, actual vs. planned functions) roll up for higher management. An individuals control chart of the number of unresolved problem reports by week of system test shows UCL = 31.6, CL = 20.04, LCL = 8.49; the companion moving range chart shows UCL = 14.2, CL = 4.35.]
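The control limits on such a chart follow the standard XmR (individuals and moving range) formulas, sketched below. The weekly counts are hypothetical stand-ins for the unresolved problem report data.

    import numpy as np

    x = np.array([18, 22, 25, 19, 17, 21, 24, 20, 16, 23, 19, 22])  # weekly counts
    mr = np.abs(np.diff(x))              # moving ranges between successive weeks
    x_bar, mr_bar = x.mean(), mr.mean()

    # Standard XmR constants: 2.66 for the individuals chart, 3.267 for the MR chart
    print(f"Individuals: CL = {x_bar:.2f}, "
          f"UCL = {x_bar + 2.66 * mr_bar:.2f}, LCL = {x_bar - 2.66 * mr_bar:.2f}")
    print(f"Moving range: CL = {mr_bar:.2f}, UCL = {3.267 * mr_bar:.2f}")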


One Comparison: Dashboard Measures

Having only these lagging indicators…
• Cost Variance
• Schedule Variance
• Milestones
• Cumulative Defect Density from Inspections
• Cumulative Defect Density from Testing

…is less effective than also having these leading indicators!
• Resolution Time of Technical Inquiries
• Requirements Volatility
• Staff Turnover
• Average Domain Experience of Team
• Complexity Values of the Architecture
• Instability of Key Interfaces
• Code Coupling and Cohesion
• Degree of Testable Requirements
• Stability of Test Environment
• Brittleness of Software

A Second Comparison: Progress Data

[Figure: two plots of number of defects over calendar time; the second adds a 95% confidence interval band around the trend.]

Traditional management review would conclude that corrective action is needed, while management in High Maturity organizations understands that the progress lies within the 95% confidence interval, so corrective action is not needed!

A Third Comparison: Customer Survey Data

[Figure: three pairs of bars showing percent satisfied for Q1-Q3, FY08 vs. FY09; only one quarter shows a statistically significant shift.]

Traditional analysis reacts to any perceived difference in average percentage results, while managers in High Maturity organizations understand that only statistically significant differences matter!
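One way to check whether a quarter-over-quarter shift is statistically significant is a two-proportion z-test; the respondent counts below are invented for illustration.

    from statsmodels.stats.proportion import proportions_ztest

    satisfied = [172, 188]    # satisfied respondents, same quarter FY08 vs FY09
    surveyed = [230, 225]     # total respondents each year
    z, p = proportions_ztest(satisfied, surveyed)
    print(f"z = {z:.2f}, p = {p:.4f}")  # react only if p is small (e.g., < 0.05)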

Predicting Uncertain Schedules with Confidence - 1

Process Durations
Step   Expected
1      30
2      50
3      80
4      50
5      90
6      25
7      35
8      45
9      70
10     25
Total  500

What would you forecast the schedule duration to be?


Predicting Uncertain Schedules with Confidence - 2

Process Durations
Step   Best  Expected  Worst
1      27    30        75
2      45    50        125
3      72    80        200
4      45    50        125
5      81    90        225
6      23    25        63
7      32    35        88
8      41    45        113
9      63    70        175
10     23    25        63
Total        500

Would you change your mind in the face of unbalanced risk?


Predicting Uncertain Schedules with Confidence - 3

[Figure: Monte Carlo simulation output.] We are almost guaranteed to miss the 500-day duration! With 90% confidence, we will be under an 817-day duration!
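A Monte Carlo sketch of the simulation implied by this slide: each step's duration is drawn from a triangular distribution over its best/expected/worst values from the previous table. The exact percentiles depend on the distributions assumed, so the 817-day figure will not be reproduced exactly.

    import numpy as np

    rng = np.random.default_rng(0)
    steps = [(27, 30, 75), (45, 50, 125), (72, 80, 200), (45, 50, 125),
             (81, 90, 225), (23, 25, 63), (32, 35, 88), (41, 45, 113),
             (63, 70, 175), (23, 25, 63)]          # (best, expected, worst)

    n = 100_000
    total = sum(rng.triangular(lo, mode, hi, n) for lo, mode, hi in steps)

    print(f"P(total <= 500 days) = {np.mean(total <= 500):.4f}")  # essentially zero
    print(f"90th percentile      = {np.percentile(total, 90):.0f} days")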


An Example Software Brittleness PPM

The outcome, Y, is the measure of software brittleness, measured on an arbitrary scale of 0 (low) to 100 (high), which will be treated as continuous data.

The x factors used in this prediction example are the following:
• Unit path complexity
• Unit data complexity
• Number of times the unit code files have been changed
• Number of unit code changes not represented in design document updates


The Brittleness PPM Equation

[The fitted regression equation appeared here.] Both confidence intervals and prediction intervals may be created for any prediction using this equation!
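A sketch of how such an equation and its intervals could be produced, using simulated data for the four x factors named on the previous slide; all coefficients and values are hypothetical.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 40
    df = pd.DataFrame({
        "path_cx": rng.integers(1, 30, n),   # unit path complexity
        "data_cx": rng.integers(1, 20, n),   # unit data complexity
        "changes": rng.integers(0, 50, n),   # times the unit code files changed
        "undoc":   rng.integers(0, 15, n),   # changes not reflected in design docs
    })
    df["brittleness"] = (1.5 * df["path_cx"] + 1.0 * df["data_cx"]
                         + 0.4 * df["changes"] + 2.0 * df["undoc"]
                         + rng.normal(0, 5, n)).clip(0, 100)

    fit = smf.ols("brittleness ~ path_cx + data_cx + changes + undoc", df).fit()
    new = pd.DataFrame({"path_cx": [12], "data_cx": [8], "changes": [20], "undoc": [5]})

    # summary_frame reports both the confidence interval for the mean response
    # (mean_ci_*) and the wider prediction interval for a new unit (obs_ci_*)
    print(fit.get_prediction(new).summary_frame(alpha=0.05))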


An Example Pre-System Test PPM

The outcome, Y, is the relative likelihood of occurrence of the different standard defect types (e.g., nominal categories such as logical, data, and algorithmic).

The x factor used in this prediction example is a measure of staff turnover of the feature development team prior to System Test.

This x factor was chosen because it historically surfaced as a significant factor in explaining the types of defects found in System Test.
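The kind of model this slide implies can be sketched as a multinomial logistic regression of defect type on staff turnover; the simulated data below is purely illustrative.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n = 300
    turnover = rng.uniform(0.0, 0.4, n)     # fraction of the team replaced

    # Simulated history in which higher turnover shifts defects toward "logical"
    p_logical = 0.2 + 1.0 * turnover
    defect_type = [rng.choice(["logical", "data", "algorithmic"],
                              p=[pl, 0.3, 0.7 - pl]) for pl in p_logical]

    X = sm.add_constant(pd.DataFrame({"turnover": turnover}))
    fit = sm.MNLogit(pd.Series(defect_type), X).fit(disp=False)

    # Relative likelihood of each defect type at 5% vs. 30% turnover
    new = pd.DataFrame({"const": [1.0, 1.0], "turnover": [0.05, 0.30]})
    print(fit.predict(new))   # columns follow the alphabetical category order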

The Pre-System Test PPM Prediction

[The prediction chart was shown here.]


A Reliability Growth Model Example - 1

You can predict future defect arrival rates from historical defect arrival rates. These curves may be fitted to reliability growth models. The fitted curves may then be converted to cumulative curves such as the one on the next slide!


A Reliability Growth Model Example - 2

[Figure: a cumulative defects curve. Of an estimated 500 total defects, 380 have been found as of today; the fitted curve shows 88 more days of testing to reach the quality goal.]

With this approach, you can conclude the remaining test time required (88 days) to reach a quality goal, and/or the latent defects to be delivered to the customer (120 defects) assuming you delivered today!
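A sketch of the fitting step, assuming a Goel-Okumoto growth curve; the observed data here is simulated, so the 88-day and 120-defect figures from the slide are not reproduced.

    import numpy as np
    from scipy.optimize import curve_fit

    def goel_okumoto(t, a, b):
        # Expected cumulative defects found by test day t
        return a * (1.0 - np.exp(-b * t))

    rng = np.random.default_rng(3)
    days = np.arange(1, 61)
    observed = goel_okumoto(days, 500, 0.02) + rng.normal(0, 8, days.size)

    (a, b), _ = curve_fit(goel_okumoto, days, observed, p0=(400.0, 0.01))
    today = days[-1]
    latent = a - goel_okumoto(today, a, b)
    print(f"Estimated total defects: {a:.0f}; latent if shipped today: {latent:.0f}")

    # Test days until only `goal` latent defects remain: a * exp(-b t) = goal
    goal = 120
    t_goal = np.log(a / goal) / b
    print(f"Quality goal reached at day {t_goal:.0f} ({t_goal - today:.0f} more days)")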


An Example of Predicting Customer Satisfaction

Y = Customer Satisfaction Scores

Possible x factors that may be used in Multiple Regression to predict Y:
Attributes of customer, including power user vs. casual user
Degree of “delighters” vs. “satisfiers” vs. “must-be” product features
Timeliness in reaching the market window
Price
Time for competitors to catch up
Economy
Product return policy
Customer service record
Ability for customers to get help and provide feedback


An Example of Predicting Recruitment

Y = Probability of Hiring a Critical Resource

Possible x factors that may be used in Multiple Regression to predict Y:
Availability of critical expertise in the local area
Salary willing to offer candidates
Other benefits, including signing bonus
Career path available to new hires
Amount of professional development provided to employees
Retirement package
Profit sharing package
Vacation available to new employees
Mobility within the organization
Degree of agile teaming employed vs. bureaucracy of organization

An Example of Predicting Retention of Critical Resources

Y = Probability of Retaining a Critical Resource

Possible x factors that may be used in Multiple Regression to predict Y:
Salary increases available to employees
Career path available to employees
Amount of professional development provided to employees
Retirement package
Profit sharing package
Vacation available to new employees
Mobility within the organization
Degree of agile teaming employed vs. bureaucracy of organization
Employee attitude survey results
Degree of conflict and politics in the organization

One Caveat To Remember!

“All Models are Wrong, Some are Useful!”

By George Box


A Second Caveat to Remember!

“I will never sacrifice reality for elegance without explaining why I have done so. Nor will I give the people who use my model false comfort about its accuracy. Instead, I will make explicit its assumptions and oversights.”

Excerpt from a BusinessWeek article titled “Perfect Models, Imperfect World”, January 12, 2009, by Emanuel Derman and Paul Wilmott, which discusses the misuse of the financial models behind the mortgage meltdown.
A New Leadership Role for Managers and Executives!


High Maturity Challenges to Leaders!

Become educated consumers of statistical and modeling output!
Demand leading indicators in addition to lagging indicators!
Lead by example with critical thinking and statistical thinking!
Demand root cause thinking instead of treating symptoms!
When presented with data, immediately ask what was statistically expected! Sort out signal from noise!
Demand prediction models within each discipline and business function! Let there be no surprises!
Demand involvement by all stakeholders and balanced models that avoid sub-optimization!
Recognize that variation exists in everything!

Questions?

Contact Information

Robert W. Stoddard
Email: rws@sei.cmu.edu

World Wide Web: www.sei.cmu.edu
For additional presentations on CMMI High Maturity:
www.sei.cmu.edu/sema/presentations.html

U.S. mail:
Software Engineering Institute
Customer Relations
4500 Fifth Avenue
Pittsburgh, PA 15213-2612
USA

Customer Relations Email: customer-relations@sei.cmu.edu
SEI Phone: +1 412-268-5800
SEI Fax: +1 412-268-6257

