
USC

C S E

University of Southern California

Center for Software Engineering

The COCOMO II Suite of Software Cost Estimation Models
Barry Boehm, USC
COCOMO/SSCM Forum 21 Tutorial
November 8, 2006
boehm@csse.usc.edu,
http://csse.usc.edu/research/cocomosuite

Thanks to USC-CSSE Affiliates (33)


Commercial Industry (10)
Cost Xpert Group, Galorath, Group Systems, IBM, Intelligent Systems, Microsoft,
Motorola, Price Systems, Softstar Systems, Sun

Aerospace Industry (8)


BAE Systems, Boeing, General Dynamics, Lockheed Martin, Northrop
Grumman(2), Raytheon, SAIC

Government (6)
FAA, NASA-Ames, NSF, US Army Research Labs, US Army TACOM, USAF
Cost Center

FFRDCs and Consortia (6)


Aerospace, FC-MD, IDA, JPL, SEI, SPC

International (3)
Institute of Software, Chinese Academy of Sciences, EASE (Japan), Samsung

11/8/06

USC-CSSE

USC

C S E

University of Southern California

Center for Software Engineering

USC-CSSE Affiliates Program


Provides priorities for, and access to, USC-CSE research
Scalable spiral processes, cost/schedule/quality models,
requirements groupware, architecting and re-engineering
tools, value-based software engineering methods.
Experience in application in DoD, NASA, industry
Affiliate community events
14th Annual Research Review and Executive Workshop, USC
Campus, February 12-15, 2007
11th Annual Ground Systems Architecture Workshop (with
Aerospace Corp.), Manhattan Beach, CA, March 26-29, 2007
22nd International COCOMO/Systems and Software Cost
Estimation Forum, USC Campus, October 23-26, 2007

Synergetic with USC distance education programs


MS Systems Architecting and Engineering
MS Computer Science (Software Engineering)

Outline


COCOMO II Overview

Overview of Emerging Extensions

Motivation and Context


Model Form and Parameters
Calibration and Accuracy

COTS Integration (COCOTS)


Quality: Delivered Defect Density (COQUALMO)
Phase Distributions (COPSEMO)
Rapid Application Development Schedule (CORADMO)
Productivity Improvement (COPROMO)
Product Line Investment (COPLIMO)
System Engineering (COSYSMO)
System of System Integration (COSOSIMO)
COCOMO II Security Extensions (COSECMO)
Network Information Protection (CONIPMO)


COCOMO Baseline Overview I


Inputs:
- Software product size estimate
- Software product, process, computer, and personnel attributes
- Software reuse, maintenance, and increment parameters
- Software organizations' project data

COCOMO outputs:
- Software development and maintenance cost and schedule estimates
- Cost and schedule distribution by phase, activity, and increment
- COCOMO recalibrated to the organization's data


COCOMO II Book Table of Contents

- Boehm, Abts, Brown, Chulani, Clark, Horowitz, Madachy, Reifer, Steece, Software
Cost Estimation with COCOMO II, Prentice Hall, 2000

1. Introduction
2. Model Definition
3. Application Examples
4. Calibration
5. Emerging Extensions
6. Future Trends
Appendices

Assumptions, Data Forms, User's Manual, CD Content

CD: Video tutorials, USC COCOMO II.2000, commercial tool demos, manuals, data forms, web site links, Affiliate forms

Need to ReEngineer COCOMO 81


New software processes


New sizing phenomena
New reuse phenomena
Need to make decisions based on
incomplete information


COCOMO II Model Stages

[Chart: the relative size range of estimates narrows from 4x / 0.25x at Concept of Operation (Feasibility), through 2x / 0.5x at Plans and Rqts. (Rqts. Spec.), 1.5x / 0.67x at Product Design (Product Design Spec.), and 1.25x / 0.8x at Detail Design (Detail Design Spec.), converging to x at Devel. and Test (Accepted Software). Across these phases and milestones the model stages are Applications Composition (3 parameters), Early Design (13 parameters), and Post-Architecture (23 parameters).]



Outline


COCOMO II Overview

Overview of Emerging Extensions

Motivation and Context


Model Form and Parameters
Calibration and Accuracy

COTS Integration (COCOTS)


Quality: Delivered Defect Density (COQUALMO)
Phase Distributions (COPSEMO)
Rapid Application Development Schedule (CORADMO)
Productivity Improvement (COPROMO)
Product Line Investment (COPLIMO)
System Engineering (COSYSMO)
System of System Integration (COSOSIMO)
COCOMO II Security Extensions (COSECMO)
Network Information Protection (CONIPMO)


Early Design and Post-Architecture Model


[Diagram: Size and the Environment effort multipliers feed the Effort estimate; the Process scale factors set the scaling exponent; the Schedule estimate is a multiplier applied to Effort.]

- Environment: Product, Platform, People, and Project factors (effort multipliers)
- Size: nonlinear reuse and volatility effects
- Process: Constraint, Risk/Architecture, Team, and Maturity factors (scale factors)



New Scaling Exponent Approach


Nominal person-months = A * (Size)^B
B = 0.91 + 0.01 * Σ (scale factor ratings)
- B ranges from 0.91 to 1.23
- 5 scale factors; 6 rating levels each

Scale factors:

- Precedentedness (PREC)
- Development flexibility (FLEX)
- Architecture/ risk resolution (RESL)
- Team cohesion (TEAM)
- Process maturity (PMAT, derived from SEI CMM)


Project Scale Factors

PM_estimated = 2.94 * (Size)^B * Π EM_i,  where B = 0.91 + 0.01 * Σ SF_i

Scale Factors (Wi) by rating level:
- PREC: Very Low = thoroughly unprecedented; Low = largely unprecedented; Nominal = somewhat unprecedented; High = generally familiar; Very High = largely familiar; Extra High = thoroughly familiar
- FLEX: rigorous; occasional relaxation; some relaxation; general conformity; some conformity; general goals
- RESL: little (20%); some (40%); often (60%); generally (75%); mostly (90%); full (100%)
- TEAM: very difficult interactions; some difficult interactions; basically cooperative interactions; largely cooperative; highly cooperative; seamless interactions
- PMAT: weighted sum of 18 KPA achievement levels
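For illustration only (not part of the slides), a minimal sketch of the effort and exponent equations above; the all-Nominal scale-factor weights used in the example are assumed values, not taken from this deck.

```python
# Minimal sketch of the COCOMO II.2000 Post-Architecture effort equation:
# PM = 2.94 * Size^B * product(EM_i), with B = 0.91 + 0.01 * sum(SF_i).

def cocomo2_effort(size_ksloc, scale_factors, effort_multipliers):
    b = 0.91 + 0.01 * sum(scale_factors)
    em = 1.0
    for m in effort_multipliers:
        em *= m
    return 2.94 * size_ksloc ** b * em

# Example: 100 KSLOC, assumed all-Nominal scale-factor weights (B ~ 1.10),
# and all-Nominal effort multipliers (1.0 each).
nominal_sf = [3.72, 3.04, 4.24, 3.29, 4.68]   # assumed Nominal SF weights
print(round(cocomo2_effort(100, nominal_sf, [1.0] * 17), 1))
```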


Nonlinear Reuse Effects


[Chart: relative cost vs. amount of software modified, from data on 2,954 NASA modules [Selby, 1988]. The observed relative costs (about 0.046, 0.55, and 0.70 at increasing fractions modified, reaching 1.0 for full modification) rise far more steeply for small modifications than the usual linear assumption.]

Reuse and Reengineering Effects


Add Assessment & Assimilation increment (AA)
- Similar to conversion planning increment

Add software understanding increment (SU)

- To cover nonlinear software understanding effects


- Coupled with software unfamiliarity level (UNFM)
- Apply only if reused software is modified

Results in revised Equivalent Source Lines of Code


(ESLOC)
- AAF = 0.4(DM) + 0.3(CM) + 0.3(IM)
- ESLOC = ASLOC[AA + AAF(1 + 0.02(SU)(UNFM))], for AAF ≤ 0.5
- ESLOC = ASLOC[AA + AAF + (SU)(UNFM)], for AAF > 0.5
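A minimal sketch of this ESLOC computation, written with the COCOMO II book's percentage conventions (DM, CM, IM, AA, SU in percent; UNFM from 0 to 1); the example input values are assumptions for illustration.

```python
# Sketch of the COCOMO II reuse (ESLOC) model in percentage form:
# DM, CM, IM, AA, SU are percentages; UNFM ranges from 0 to 1.

def esloc(asloc, dm, cm, im, aa=0.0, su=30.0, unfm=0.4):
    """Equivalent SLOC for adapted code of size `asloc`."""
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im          # Adaptation Adjustment Factor (%)
    if aaf <= 50:
        aam = (aa + aaf * (1 + 0.02 * su * unfm)) / 100
    else:
        aam = (aa + aaf + su * unfm) / 100
    return asloc * aam

# Example: 20,000 adapted SLOC with 10% design, 20% code, 30% integration modified.
print(round(esloc(20000, dm=10, cm=20, im=30)))
```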


Software Understanding Rating / Increment

Structure:
- Very Low: very low cohesion, high coupling, spaghetti code; Low: moderately low cohesion, high coupling; Nominal: reasonably well structured, some weak areas; High: high cohesion, low coupling; Very High: strong modularity, information hiding in data/control structures
Application Clarity:
- Very Low: no match between program and application world views; Low: some correlation between program and application; Nominal: moderate correlation between program and application; High: good correlation between program and application; Very High: clear match between program and application world views
Self-Descriptiveness:
- Very Low: obscure code; documentation missing, obscure, or obsolete; Low: some code commentary and headers, some useful documentation; Nominal: moderate level of code commentary, headers, documentation; High: good code commentary and headers, useful documentation, some weak areas; Very High: self-descriptive code; documentation up-to-date, well-organized, with design rationale
SU Increment to ESLOC:
- Very Low: 50; Low: 40; Nominal: 30; High: 20; Very High: 10


Other Major COCOMO II Changes


Range versus point estimates
Requirements Volatility (Evolution) included in Size
Multiplicative cost driver changes
- Product CDs
- Platform CDs
- Personnel CDs
- Project CDs
Maintenance model includes SU, UNFM factors from reuse
model
Applied to subset of legacy code undergoing change

COCOMO II Estimation Accuracy:
- Percentage of sample projects within 30% of actuals
- Without and with calibration to data source

              COCOMO 81    COCOMO II.1997    COCOMO II.2000
# Projects        63              83               161
Effort           81%          52% / 64%        75% / 80%
Schedule         65%          61% / 62%        72% / 81%


COCOMO II.2000 Productivity Ranges


Scale Factor Ranges: 10, 100, 1000 KSLOC

Development Flexibility (FLEX)


Team Cohesion (TEAM)
Develop for Reuse (RUSE)
Precedentedness (PREC)
Architecture and Risk Resolution (RESL)
Platform Experience (PEXP)
Data Base Size (DATA)
Required Development Schedule (SCED)
Language and Tools Experience (LTEX)
Process Maturity (PMAT)
Storage Constraint (STOR)
Use of Software Tools (TOOL)
Platform Volatility (PVOL)
Applications Experience (AEXP)
Multi-Site Development (SITE)
Documentation Match to Life Cycle Needs (DOCU)
Required Software Reliability (RELY)
Personnel Continuity (PCON)
Time Constraint (TIME)
Programmer Capability (PCAP)
Analyst Capability (ACAP)
Product Complexity (CPLX)
[Bar chart: each driver's productivity range (ratio of highest to lowest rating multiplier), spanning roughly 1.0 to 2.4; the drivers above are listed in increasing order of productivity range.]


Outline


COCOMO II Overview

Overview of Emerging Extensions

Motivation and Context


Model Form and Parameters
Calibration and Accuracy

COTS Integration (COCOTS)


Quality: Delivered Defect Density (COQUALMO)
Phase Distributions (COPSEMO)
Rapid Application Development Schedule (CORADMO)
Productivity Improvement (COPROMO)
Product Line Investment (COPLIMO)
System Engineering (COSYSMO)
System of System Integration (COSOSIMO)
COCOMO II Security Extensions (COSECMO)
Network Information Protection (CONIPMO)


Status of Models

[Table: each model (COCOMO II; COCOTS; COQUALMO, covering defects in and defects out; CORADMO; COSYSMO) is rated against the modeling-methodology steps: literature review, behavioral analysis, identification of significant variables, expert Delphi, and data/Bayesian calibration. COCOMO II has been calibrated to more than 200 project data points; the extensions have much smaller calibration data sets.]


COCOMO vs. COCOTS Cost Sources

[Chart: staffing level vs. time across the life cycle, contrasting where COCOMO-covered and COCOTS-covered cost sources occur.]


Current COQUALMO System


[Diagram: the COCOMO II software size estimate and the software platform, project, product, and personnel attributes drive COQUALMO's Defect Introduction Model, along with the COCOMO II software development effort, cost, and schedule estimate. Defect removal profile levels (automation, reviews, testing) drive the Defect Removal Model. COQUALMO outputs the number of residual defects and the defect density per unit of size.]


Defect Removal Rating Scales


- COCOMO II, p. 263

Automated Analysis:
- Very Low: simple compiler syntax checking
- Low: basic compiler capabilities
- Nominal: compiler extensions; basic requirements and design consistency
- High: intermediate-level module analysis; simple requirements/design analysis
- Very High: more elaborate requirements/design analysis; basic distributed processing
- Extra High: formalized specification and verification; advanced distributed processing

Peer Reviews:
- Very Low: no peer review
- Low: ad-hoc informal walkthroughs
- Nominal: well-defined preparation, review, minimal follow-up
- High: formal review roles, well-trained people, and basic checklists
- Very High: root cause analysis, formal follow-up, using historical data
- Extra High: extensive review checklists; statistical control

Execution Testing and Tools:
- Very Low: no testing
- Low: ad-hoc test and debug
- Nominal: basic tests; test criteria based on checklists
- High: well-defined test sequences and a basic test coverage tool system
- Very High: more advanced test tools and preparation; distributed monitoring
- Extra High: highly advanced tools, model-based testing


Defect Removal Estimates


- At nominal defect introduction rates

Composite Defect Removal Rating:   VL     Low     Nom     High    VH     XH
Delivered Defects / KSLOC:         60     28.5    14.3    7.5     3.5    1.6
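For illustration only, a trivial lookup based on the nominal-introduction-rate figures above; the 100-KSLOC example is an assumption.

```python
# Delivered defects per KSLOC at nominal defect introduction rates,
# indexed by the composite defect removal rating (values from the chart above).
DELIVERED_DEFECTS_PER_KSLOC = {
    "VL": 60.0, "Low": 28.5, "Nom": 14.3, "High": 7.5, "VH": 3.5, "XH": 1.6,
}

def residual_defects(size_ksloc, removal_rating):
    """Estimated residual defects for a product of `size_ksloc` KSLOC."""
    return size_ksloc * DELIVERED_DEFECTS_PER_KSLOC[removal_rating]

print(residual_defects(100, "High"))  # 100 KSLOC at a High removal rating -> 750.0
```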


Outline


COCOMO II Overview

Overview of Emerging Extensions

Motivation and Context


Model Form and Parameters
Calibration and Accuracy

COTS Integration (COCOTS)


Quality: Delivered Defect Density (COQUALMO)
Phase Distributions (COPSEMO)
Rapid Application Development Schedule (CORADMO)
Productivity Improvement (COPROMO)
Product Line Investment (COPLIMO)
System Engineering (COSYSMO)
System of System Integration (COSOSIMO)
COCOMO II Security Extensions (COSECMO)
Network Information Protection (CONIPMO)


COCOMO II RAD Extension (CORADMO)

[Diagram: the COCOMO II cost drivers (except SCED) plus language level and experience feed COCOMO II, which produces the baseline effort and schedule. Phase Distributions (COPSEMO) turn these into effort and schedule by stage. The RAD Extension then applies the RAD drivers (RVHL, DPRS, CLAB, RESL, PPOS, RCAP) to produce RAD effort and schedule by phase.]

Effect of RCAP on Cost, Schedule


[Chart: schedule in months vs. effort (0 to 50 PM), with curves for RCAP ratings from XL to XH. Reference curves shown: 3.7 * (cube root of PM), 3 * (cube root of PM), and square root of PM.]


COPROMO (Productivity) Model


Uses COCOMO II model and extensions as
assessment framework
Well-calibrated to 161 projects for effort, schedule
Subset of 106 1990s projects for current-practice baseline
Extensions for Rapid Application Development formulated

Determines impact of technology investments on


model parameter settings
Uses these in models to assess impact of
technology investments on cost and schedule
Effort used as a proxy for cost


The COPLIMO Model

Constructive Product Line Investment Model


Based on COCOMO II software cost model

Statistically calibrated to 161 projects, representing 18 diverse


organizations

Based on standard software reuse economic terms


RCR: Relative cost of reuse
RCWR: Relative cost of writing for reuse

Avoids overestimation

Avoids RCWR for non-reused components


Adds life cycle cost savings

Provides experience-based default parameter values


Simple Excel spreadsheet model
Easy to modify, extend, interoperate


COPLIMO Estimation Summary


Part I: Product Line Development Cost Estimation Summary:

# of Products               0      1      2      3      4      5
Effort (PM), No Reuse       0    294    588    882   1176   1470
Product Line                0    444    589    735    881   1026
Product Line Savings        0   -150     -1    147    295    444
ROI                         0  -1.00  -0.01   0.98   1.97   2.96

[Chart: Product Line Development Cost Estimation, net development effort savings (about -200 to +600 PM) vs. number of products in the product line.]

Part II: Product Line Annualized Life Cycle Cost Estimation Summary:

# of Products               0      1       2       3       4       5
AMSIZE-P                    0    8.1    16.2    24.2    32.3    40.4
AMSIZE-R                    0    6.1     6.1     6.1     6.1     6.1
AMSIZE-A                    0    6.1     7.7     9.3    11.0    12.6
Total Equiv. KSLOC          0   20.2    29.9    39.6    49.3    59.1
Effort (AM) (*2.94)         0   59.4    88.0   116.5   145.1   173.7
5-year Life Cycle PM        0  296.9   439.8   582.6   725.4   868.3
PM(N, 5)-R (+444)           0  740.9   883.7  1026.5  1169.4  1312.2
PM(N, 5)-NR                 0  590.9  1181.9  1772.8  2363.8  2954.7
Product Line Savings (PM)   0 -149.9   298.2   746.3  1194.4  1642.5
ROI                         0  -1.00    1.99    4.98    7.97   10.96
Devel. ROI                  0  -1.00   -0.01    0.98    1.97    2.96
3-year Life Cycle           0 -142.0   120.0   480.0

[Chart: Net Product Line Effort Savings (about -200 to +800 PM) vs. number of products, with curves for the 5-year life cycle, the 3-year life cycle, and development only.]

AMSIZE: Annually Maintained Software Size
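Purely as an illustration of the reuse economics above (not the actual COPLIMO spreadsheet), the sketch below treats effort as proportional to equivalent size; the product-size, fraction, and RCR/RCWR values are assumptions chosen to roughly reproduce the Part I development numbers.

```python
# Simplified sketch of COPLIMO-style development ROI. Effort is assumed
# proportional to size; parameter values are illustrative, chosen to roughly
# match Part I above (294 PM per product with no reuse).

def product_line_development(n_products, size_ksloc=50.0, pm_per_ksloc=5.88,
                             unique_frac=0.4, reused_frac=0.6,
                             rcwr=1.85, rcr=0.16):
    base = size_ksloc * pm_per_ksloc                      # one product, no reuse
    no_reuse = n_products * base
    first = base * (unique_frac + reused_frac * rcwr)     # first product pays RCWR
    later = base * (unique_frac + reused_frac * rcr)      # later products pay RCR
    product_line = first + max(n_products - 1, 0) * later
    investment = first - base                             # extra cost of building for reuse
    savings = no_reuse - product_line
    return no_reuse, product_line, savings, savings / investment

for n in range(1, 6):
    nr, pl, sav, roi = product_line_development(n)
    print(n, round(nr), round(pl), round(sav), round(roi, 2))
```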


Outline


COCOMO II Overview

Overview of Emerging Extensions

Motivation and Context


Model Form and Parameters
Calibration and Accuracy

COTS Integration (COCOTS)


Quality: Delivered Defect Density (COQUALMO)
Phase Distributions (COPSEMO)
Rapid Application Development Schedule (CORADMO)
Productivity Improvement (COPROMO)
Product Line Investment (COPLIMO)
System Engineering (COSYSMO)
System of System Integration (COSOSIMO)
COCOMO II Security Extensions (COSECMO)
Network Information Protection (CONIPMO)


Model Differences

COCOMO II                      COSYSMO
Software                       Systems engineering
Development phases             Entire life cycle
20+ years old                  3 years old
200+ calibration points        60 calibration points
23 drivers                     18 drivers
Variable granularity           Fixed granularity
3 anchor points                No anchor points
Size driven by SLOC            Size driven by requirements, I/F, etc.

COSYSMO Operational Concept


[Diagram: size drivers (# Requirements, # Interfaces, # Scenarios, # Algorithms, plus a volatility factor) and effort multipliers (8 application factors, 6 team factors, and a schedule driver) feed COSYSMO, which produces the systems engineering effort estimate; the model is supported by calibration data, with a WBS guided by ISO/IEC 15288.]

COSOSIMO Operational Concept

[Diagram: size drivers (interface-related eKSLOC, number of logical interfaces at the SoS level, number of operational scenarios, number of components) and exponential scale factors (integration simplicity, integration risk resolution, integration stability, component readiness, integration capability, integration processes) feed COSOSIMO, which produces the SoS definition and integration effort estimate, supported by calibration data.]


Security Impact on Engineering Effort


For software developers:
- Source lines of code increase
- Effort to generate software increases
  - Security functional requirements
  - Security assurance requirements
- Effort to transition also increases
  - More documentation
  - Additional certification and accreditation costs
- Being addressed by COSECMO

For systems engineers:
- Effort to develop the system increases
  - Network defense requirements
  - Network defense operational concepts
  - Program protection requirements
  - Anti-tamper implementation
- Effort to transition also increases
  - DITSCAP and red teaming
- Being addressed by CONIPMO


Backup Charts


Purpose of COCOMO II
To help people reason about the
cost and schedule implications of
their software decisions


Major Decision Situations Helped by COCOMO II

- Software investment decisions
  - When to develop, reuse, or purchase
  - What legacy software to modify or phase out
- Setting project budgets and schedules
- Negotiating cost/schedule/performance tradeoffs
- Making software risk management decisions
- Making software improvement decisions
  - Reuse, tools, process maturity, outsourcing


Relations to MBASE*/Rational
Anchor Point Milestones
[Diagram: maps the COCOMO II stages onto the anchor point milestones and process phases. Applications Composition spans the Inception phase; Early Design spans LCO to LCA; Post-Architecture covers system development. Waterfall milestones SRR (Rqts.), PDR (Prod. Des.), Development, and SAT/IOC (Transition) line up with the MBASE/Rational phases Inception, Elaboration, Construction, and Transition and the anchor points LCO, LCA, and IOC.]

*MBASE: Model-Based (System) Architecting and Software Engineering



Post-Architecture EMs - Product:

Required Reliability (RELY):
- Very Low: slight inconvenience (0.82); Low: low, easily recoverable losses (0.92); Nominal: moderate, easily recoverable losses (1.00); High: high financial loss (1.10); Very High: risk to human life (1.26)
Database Size (DATA), DB bytes / Pgm SLOC:
- Low: D/P < 10; Nominal: 10 <= D/P < 100; High: 100 <= D/P < 1000; Very High: D/P >= 1000
Complexity (CPLX): see Complexity Table
Required Reuse (RUSE):
- Low: none; Nominal: across project; High: across program; Very High: across product line; Extra High: across multiple product lines
Documentation Match to Lifecycle (DOCU):
- Very Low: many lifecycle needs uncovered; Low: some lifecycle needs uncovered; Nominal: right-sized to lifecycle needs; High: excessive for lifecycle needs; Very High: very excessive for lifecycle needs


Post-Architecture Complexity (CPLX)

[Table: complexity is rated Very Low through Extra High across five types of operations: control operations, computational operations, device-dependent operations, data management operations, and user interface management operations. The example cells preserved here:]
- Control operations: mostly simple nesting; some intermodule control; decision tables; simple callbacks or message passing, including middleware-supported distributed processing
- Computational operations: use of standard math and statistical routines; basic matrix/vector operations
- Device-dependent operations: I/O processing includes device selection, status checking, and error processing
- Data management operations: multi-file input and single-file output; simple structural changes, simple edits; complex COTS-DB queries and updates
- User interface management operations: simple use of widget set


Post-Architecture EMs - Platform:

Execution Time Constraint (TIME):
- Nominal: <= 50% use of available execution time; High: 70%; Very High: 85%; Extra High: 95%
Main Storage Constraint (STOR):
- Nominal: <= 50% use of available storage; High: 70%; Very High: 85%; Extra High: 95%
Platform Volatility (PVOL):
- Low: major change every 12 mo., minor change every 1 mo.; Nominal: major every 6 mo., minor every 2 wk.; High: major every 2 mo., minor every 1 wk.; Very High: major every 2 wk., minor every 2 days


Post-Architecture EMs - Personnel:

                                  Very Low    Low        Nominal    High      Very High
Analyst Capability (ACAP)         15th pct.   35th pct.  55th pct.  75th pct. 90th pct.
Programmer Capability (PCAP)      15th pct.   35th pct.  55th pct.  75th pct. 90th pct.
Personnel Continuity (PCON)       48%/year    24%/year   12%/year   6%/year   3%/year
Application Experience (AEXP)     <2 months   6 months   1 year     3 years   6 years
Platform Experience (PEXP)        <2 months   6 months   1 year     3 years   6 years
Language and Tool Exp. (LTEX)     <2 months   6 months   1 year     3 years   6 years


Post-Architecture EMs - Project:

Use of Software Tools (TOOL):
- Very Low: edit, code, debug; Low: simple front-end/back-end CASE, little integration; Nominal: basic lifecycle tools, moderately integrated; High: strong, mature lifecycle tools, moderately integrated; Very High: strong, mature, proactive lifecycle tools, well integrated with processes, methods, reuse
Multisite Development: Collocation (SITE):
- Very Low: international; Low: multi-city and multi-company; Nominal: multi-city or multi-company; High: same city or metro area; Very High: same building or complex; Extra High: fully collocated
Multisite Development: Communications (SITE):
- Very Low: some phone, mail; Low: individual phone, FAX; Nominal: narrowband email; High: wideband electronic communication; Very High: wideband electronic communication, occasional video conference; Extra High: interactive multimedia
Required Development Schedule (SCED):
- Very Low: 75% of nominal; Low: 85%; Nominal: 100%; High: 130%; Very High: 160%


Other Model Refinements

Initial Schedule Estimation

TDEV = 3.67 * (PM)^(0.28 + 0.2*(B - 0.91)) * (SCED% / 100)

where PM = estimated person-months excluding Schedule (SCED) multiplier effects

Output Ranges
Stage                       Optimistic Estimate   Pessimistic Estimate
Application Composition     0.50 E                2.0 E
Early Design                0.67 E                1.5 E
Post-Architecture           0.80 E                1.25 E

- 80% confidence limits: 10% of time each below Optimistic, above Pessimistic
- Reflect sources of uncertainty in model inputs
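A minimal sketch of the schedule equation above, for illustration only; the example effort and exponent values are assumptions.

```python
# Sketch of the COCOMO II initial schedule equation shown above.

def tdev_months(pm_ns, b, sced_percent=100):
    """TDEV = 3.67 * PM^(0.28 + 0.2*(B - 0.91)) * SCED%/100.

    pm_ns: estimated person-months excluding the SCED multiplier.
    b:     scaling exponent from the effort model (0.91 to 1.23).
    """
    return 3.67 * pm_ns ** (0.28 + 0.2 * (b - 0.91)) * sced_percent / 100

# Example: 586 PM at B = 1.10 with a nominal (100%) schedule -> about 28 months.
print(round(tdev_months(586, 1.10), 1))
```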

Early Design vs. Post-Arch EMs:

Early Design Cost Driver              Counterpart Combined Post-Architecture Cost Drivers
Product Reliability and Complexity    RELY, DATA, CPLX, DOCU
Required Reuse                        RUSE
Platform Difficulty                   TIME, STOR, PVOL
Personnel Capability                  ACAP, PCAP, PCON
Personnel Experience                  AEXP, PEXP, LTEX
Facilities                            TOOL, SITE
Schedule                              SCED


Outline

COCOMO II Overview

Overview of Emerging Extensions


Motivation and Context


Model Form and Parameters
Calibration and Accuracy

COTS Integration (COCOTS)


Quality: Delivered Defect Density (COQUALMO)
Phase Distributions (COPSEMO)
Rapid Application Development Schedule (CORADMO)
Productivity Improvement (COPROMO)
Product Line Investment (COPLIMO)
System Engineering (COSYSMO)
System of System Integration (COSOSIMO)
Dependability ROI (iDAVE)


USC-CSE Modeling Methodology


- Concurrency and feedback implied

Step 1: Analyze existing literature
Step 2: Perform behavioral analyses
Step 3: Identify relative significance
Step 4: Perform expert-judgment Delphi assessment; formulate a-priori model
Step 5: Gather project data
Step 6: Determine Bayesian a-posteriori model
Step 7: Gather more data; refine model


Results of Bayesian Update: Using Prior and Sampling Information (Step 6)

[Chart: example for Language and Tool Experience (LTEX). Productivity Range = highest rating / lowest rating. The noisy data analysis, the literature and behavioral analysis, and the experts' a-priori Delphi estimates are combined into the a-posteriori Bayesian update; the productivity-range values shown are 1.06, 1.41, 1.45, and 1.51.]

COCOMO Model Comparisons


Size:
- COCOMO: Delivered Source Instructions (DSI) or Source Lines of Code (SLOC)
- Ada COCOMO: DSI or SLOC
- COCOMO II Application Composition: Application Points
- COCOMO II Early Design: Function Points (FP) and Language, or SLOC
- COCOMO II Post-Architecture: FP and Language, or SLOC

Reuse:
- COCOMO: Equivalent SLOC = linear f(DM, CM, IM)
- Ada COCOMO: Equivalent SLOC = linear f(DM, CM, IM)
- Application Composition: implicit in model
- Early Design: Equivalent SLOC = nonlinear f(AA, SU, UNFM, DM, CM, IM)
- Post-Architecture: Equivalent SLOC = nonlinear f(AA, SU, UNFM, DM, CM, IM)

Rqts. Change:
- COCOMO: Requirements Volatility rating (RVOL)
- Ada COCOMO: RVOL rating
- Application Composition: implicit in model
- Early Design: change %: RQEV
- Post-Architecture: RQEV

Maintenance:
- COCOMO: Annual Change Traffic (ACT) = %added + %modified
- Ada COCOMO: ACT
- Application Composition: Object Point ACT
- Early Design: (ACT, SU, UNFM)
- Post-Architecture: (ACT, SU, UNFM)

Scale (b) in MM_NOM = a(Size)^b:
- COCOMO: Organic: 1.05; Semidetached: 1.12; Embedded: 1.20
- Ada COCOMO: Embedded: 1.04-1.24, depending on degree of early risk elimination, solid architecture, stable requirements, Ada process maturity
- Application Composition: 1.0
- Early Design and Post-Architecture: 0.91-1.23, depending on the degree of precedentedness, conformity, early architecture and risk resolution, team cohesion, and process maturity (SEI)

Product Cost Drivers:
- COCOMO: RELY, DATA, CPLX
- Ada COCOMO: RELY*, DATA, CPLX*, RUSE
- Application Composition: none
- Early Design: RCPX*, RUSE*
- Post-Architecture: RELY, DATA, DOCU*, CPLX, RUSE*

Platform Cost Drivers:
- COCOMO: TIME, STOR, VIRT, TURN
- Ada COCOMO: TIME, STOR, VMVH, VMVT, TURN
- Application Composition: none
- Early Design: platform difficulty: PDIF*
- Post-Architecture: TIME, STOR, PVOL (=VIRT)

Personnel Cost Drivers:
- COCOMO: ACAP, AEXP, PCAP, VEXP, LEXP
- Ada COCOMO: ACAP*, AEXP, PCAP*, VEXP, LEXP*
- Application Composition: none
- Early Design: personnel capability and experience: PERS*, PREX*
- Post-Architecture: ACAP*, AEXP, PCAP*, PEXP*, LTEX*, PCON*

Project Cost Drivers:
- COCOMO: MODP, TOOL, SCED
- Ada COCOMO: MODP*, TOOL*, SCED, SECU
- Application Composition: none
- Early Design: SCED, FCIL*
- Post-Architecture: TOOL*, SCED, SITE*

* Different Multipliers
Different Rating Scale


COCOMO II Experience Factory: I


[Diagram: system objectives (functionality, performance, quality) and corporate parameters (tools, processes, reuse) feed COCOMO 2.0, which produces cost, schedule, and risk estimates; if they are not OK, the system is rescoped and re-estimated.]


COCOMO II Experience Factory: II


[Diagram: extends Factory I with project execution. Once cost, schedule, and risks are OK, milestone plans, resources, and expectations are set and the project executes to the next milestone. Milestone results are compared with expectations; if not OK, milestones, plans, and resources are revised (with revised expectations); the cycle repeats until the project is done.]


COCOMO II Experience Factory: III


[Diagram: extends Factory II with calibration. Milestone results are accumulated as COCOMO 2.0 calibration data and used to recalibrate COCOMO 2.0 for subsequent estimates.]


COCOMO II Experience Factory: IV


[Diagram: extends Factory III with corporate improvement. Cost, schedule, and quality drivers from the recalibrated COCOMO 2.0 feed an evaluation of corporate software improvement strategies, producing improved corporate parameters (tools, processes, reuse) for future projects.]


New Glue Code Submodel Results


Current calibration looking reasonably good
Excluding projects with very large,
very small amounts of glue code (Effort Pred):
[0.5 - 100 KLOC]: Pred (.30) = 9/17 = 53%
[2 - 100 KLOC]: Pred (.30) = 8/13 = 62%
For comparison, calibration results shown at ARR 2000:
[0.1 - 390 KLOC]: Pred (.30) = 4/13 = 31%
Propose to revisit large, small, anomalous projects
A few follow-up questions on categories of code & effort
Glue code vs. application code
Glue code effort vs. other sources

Current Insights into Maintenance Phase Issues


Priority of activities by effort involved and/or criticality (S = spikes around refresh-cycle anchor points; C = continuous):
- Higher: training (S, C); configuration management (C); operations support (C); integration analysis (S); requirements management (S, C)
- Medium: certification (S); market watch (C); distribution (S); vendor management (C); business case evaluation (S)
- Lower: administering COTS licenses (C)

RAD Context
RAD a critical competitive strategy
Market window; pace of change

Non-RAD COCOMO II overestimates


RAD schedules

Need opportunity-tree cost-schedule


adjustment
Cube root model inappropriate for small
RAD projects
COCOMO II: Mo. = 3.7 * (PM)^(1/3)


RAD Opportunity Tree

- Eliminating tasks: development process reengineering (DPRS); reusing assets (RVHL); applications generation (RVHL); Schedule as Independent Variable process
- Reducing time per task: tools and automation (O); work streamlining, 80-20 (O); increasing parallelism (RESL)
- Reducing risks of single-point failures: reducing failures (RESL); reducing their effects (RESL)
- Reducing backtracking: early error elimination (RESL); process anchor points (RESL); improving process maturity (O); collaboration technology (CLAB)
- Activity network streamlining: minimizing task dependencies (DPRS); avoiding high fan-in, fan-out (DPRS); reducing task variance (DPRS); removing tasks from critical path (DPRS)
- Increasing effective workweek: 24x7 development (PPOS); nightly builds, testing (PPOS); weekend warriors (PPOS)
- Better people and incentives: RAD capability and experience (RCAP)

O: covered by


RCAP: RAD Capability of Personnel

RATING / FACTOR     XL      VL      L       N      H       VH      XH
PERS-R              10%     25%     40%     55%    70%     85%     95%
PREX-R              2 mo    4 mo    6 mo    1 yr   3 yrs   6 yrs   10 yrs
I, E, C multipliers:
  PM                1.20    1.13    1.06    1.0    0.93    0.86    0.80
  M                 1.40    1.25    1.12    1.0    0.82    0.68    0.56
  P = PM/M          0.86    0.90    0.95    1.0    1.13    1.26    1.43

PERS-R is the Early Design Capability rating, adjusted to reflect the performers' capability to rapidly assimilate new concepts and material, and to rapidly adapt to change.

PREX-R is the Early Design Personnel Experience rating, adjusted to reflect the performers' experience with RAD languages, tools, components, and COTS integration.

RCAP Example

RCAP = Nominal: PM = 25, M = 5, P = 5
- The square root law: 5 people for 5 months: 25 PM

RCAP = XH: PM = 20, M = 2.8, P = 7.1
- A very good team can put on 7 people and finish in 2.8 months: 20 PM

RCAP = XL: PM = 30, M = 7, P = 4.3
- Trying to do RAD with an unqualified team makes them less efficient (30 PM) and gets the schedule closer to the cube root law (but not quite: 3 * (30)^(1/3) = 9.3 months > 7 months)
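A small sketch that reproduces the arithmetic of this example, applying the PM and schedule (M) multipliers from the RCAP table above to the nominal square-root-law project; it is illustrative only.

```python
# Reproduces the RCAP example: start from a nominal square-root-law project
# (25 PM, 5 months, 5 people) and apply the RCAP effort (PM) and schedule (M)
# multipliers from the RCAP table.
RCAP_MULTIPLIERS = {  # rating: (effort multiplier, schedule multiplier)
    "XL": (1.20, 1.40), "N": (1.0, 1.0), "XH": (0.80, 0.56),
}

def rcap_case(rating, pm_nominal=25.0, months_nominal=5.0):
    pm_mult, m_mult = RCAP_MULTIPLIERS[rating]
    pm = pm_nominal * pm_mult
    months = months_nominal * m_mult
    return pm, months, pm / months   # effort, schedule, average staff

for rating in ("XL", "N", "XH"):
    print(rating, [round(v, 1) for v in rcap_case(rating)])
```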


COPLIMO Inputs and Outputs


[Diagram: for the current set of similar products, the inputs are the average product size, COCOMO II cost drivers, percent mission-unique / reused-with-mods / black-box reuse, RCR and RCWR factors, and annual change traffic. COPLIMO outputs, as functions of the number of products and the number of years in the life cycle, the non-product-line effort, the product line investment and effort, and the product line savings and ROI.]

4 Size Drivers
1. Number of System Requirements
2. Number of Major Interfaces
3. Number of Operational Scenarios
4. Number of Critical Algorithms

Each weighted by complexity, volatility, and degree of reuse (see the sketch below)
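A minimal sketch of the idea of weighted size aggregation; the weight values and the easy/nominal/difficult categories below are placeholders for illustration, not the calibrated COSYSMO weights.

```python
# Illustrative only: aggregate COSYSMO-style size from counts of the four size
# drivers, each classified as easy / nominal / difficult. Weights are placeholders.
WEIGHTS = {
    "requirements": {"easy": 0.5, "nominal": 1.0, "difficult": 5.0},
    "interfaces":   {"easy": 1.1, "nominal": 2.8, "difficult": 6.3},
    "scenarios":    {"easy": 6.2, "nominal": 14.4, "difficult": 30.0},
    "algorithms":   {"easy": 2.2, "nominal": 4.1, "difficult": 11.5},
}

def cosysmo_size(counts):
    """counts: {driver: {difficulty: count}} -> weighted size."""
    return sum(WEIGHTS[driver][level] * n
               for driver, levels in counts.items()
               for level, n in levels.items())

print(cosysmo_size({"requirements": {"easy": 50, "nominal": 80, "difficult": 10},
                    "interfaces": {"nominal": 12}}))
```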


Number of System Requirements


This driver represents the number of requirements for the system-of-interest at a specific level of design. Requirements may be functional, performance, feature, or service-oriented in nature, depending on the methodology used for specification. They may also be defined by the customer or contractor. System requirements can typically be quantified by counting the number of applicable "shalls" or "wills" in the system or marketing specification. Do not include a requirements expansion ratio; only provide a count for the requirements of the system-of-interest as defined by the system or marketing specification.

Easy: well specified; traceable to source; simple to understand; little requirements overlap; familiar; good understanding of what's needed to satisfy and verify requirements
Nominal: loosely specified; can be traced to source with some effort; takes some effort to understand; some overlap; generally familiar; general understanding of what's needed to satisfy and verify requirements
Difficult: poorly specified; hard to trace to source; hard to understand; high degree of requirements overlap; unfamiliar; poor understanding of what's needed to satisfy and verify requirements


14 Cost Drivers
Application Factors (8)
1. Requirements understanding
2. Architecture complexity
3. Level of service requirements
4. Migration complexity
5. Technology maturity
6. Documentation match to life cycle needs
7. # and diversity of installations/platforms
8. # of recursive levels in the design

Level of service (KPP) requirements


This cost driver rates the difficulty and criticality of satisfying the ensemble of Key Performance Parameters (KPPs), such as security, safety, response time, interoperability, maintainability, the "ilities", etc.

Viewpoint     Very Low               Low                         Nominal                        High                      Very High
Difficulty    Simple                 Low difficulty, coupling    Moderately complex, coupled    Difficult, coupled KPPs   Very complex, tightly coupled
Criticality   Slight inconvenience   Easily recoverable losses   Some loss                      High financial loss       Risk to human life


14 Cost Drivers (cont.)

Team Factors (6)
1. Stakeholder team cohesion
2. Personnel/team capability
3. Personnel experience/continuity
4. Process maturity
5. Multisite coordination
6. Tool support


4. Rate Cost Drivers - Application


9. *Time Phase the Estimate Overall Staffing


Proposed COSOSIMO Size Drivers

- Subsystem size of interface software, measured in effective KSLOC (eKSLOC):
  - Number of components
  - Number of major interfaces
  - Number of operational scenarios
  - Each weighted by complexity, volatility, and degree of COTS/reuse

[Diagram: a system of systems with subsystems S1-S4 and the interfaces among them.]


Proposed COSOSIMO Scale Factors


- Integration risk resolution: risk identification and mitigation efforts
- Integration simplicity: architecture and performance issues
- Integration stability: how much change is expected
- Component readiness: how much prior testing has been conducted on the components
- Integration capability: people factor
- Integration processes: maturity level of processes and integration lab


Reasoning about the Value of Dependability: iDAVE

- iDAVE: Information Dependability Attribute Value Estimator
- Use the iDAVE model to estimate and track software dependability ROI
  - Helps determine how much dependability is enough
  - Helps analyze and select the most cost-effective combination of software dependability techniques
  - Use estimates as a basis for tracking performance


iDAVE Model Framework


[Diagram: time-phased information processing capabilities and project attributes feed cost estimating relationships (CERs): Cost = f(IP capabilities (size), project attributes). Time-phased dependability investments feed dependability attribute estimating relationships (DERs): Di = gi(dependability investments, project attributes). Value estimating relationships (VERs) give the time-phased value components: Vj = hj(IP capabilities, dependability levels Di). Cost and the value components together yield the return on investment.]


Typical Value Estimating Relationships

[Chart: value ($) vs. availability, rising from the investment region toward full value at availability 1.0, with linear, high-returns, and diminishing-returns production function shapes.]

Revenue loss per hour of system downtime:
- Intel: $275K
- Cisco: $167K
- Dell: $83K
- Amazon.com: $27K
- E*Trade: $8K
- eBay: $3K
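For illustration only (simple arithmetic on the per-hour loss figures above, not an iDAVE value estimating relationship), the expected annual downtime cost at a given availability:

```python
# Illustrative arithmetic: expected annual revenue loss from downtime,
# using the per-hour loss figures quoted on the slide above.
HOURS_PER_YEAR = 24 * 365

LOSS_PER_HOUR = {"Intel": 275_000, "Cisco": 167_000, "Dell": 83_000,
                 "Amazon.com": 27_000, "E*Trade": 8_000, "eBay": 3_000}

def annual_downtime_loss(company, availability):
    """Expected yearly loss if the system is up `availability` of the time."""
    return (1.0 - availability) * HOURS_PER_YEAR * LOSS_PER_HOUR[company]

print(f"${annual_downtime_loss('Dell', 0.999):,.0f}")  # 0.1% downtime ~ 8.76 h/yr
```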

ROI Analysis Results Comparison
