
Overview of Software Capability Evaluations in Source Selection

Charlie Ryan, SEI March 30, 2000

Outline
- Source Selection Process With SCE Site Visit
  - SS Timeline and SCEs
  - SS Decision Process
  - SS Documentation Affected
  - Summary of Activities Prior to Site Visit
- SCE Process
- Use of SCE Results in Source Selection
- Summary

Source Selection Decision Process


[Diagram: source selection decision hierarchy]
- Source Selection Authority (SSA): award decision
- Source Selection Advisory Council (SSAC): risk assessment
- Source Selection Evaluation Board (SSEB): data evaluation and risk identification
- Evaluation teams (SCE team data collection, cost evaluation, management evaluation, other factors) feed findings up to the SSEB
- Offerors submit proposals, which are the input to the evaluations

Source Selection Documentation Affected by SCE


Prior to SCE Site Visit
- Commerce Business Daily Announcement
- Source Selection Plan (SSP)
- Evaluation Plan
- Bidders' Briefing
- Request for Proposal (RFP)
- Possibly the Statement of Work or Award Fee Plan

After SCE Site Visit


- Briefing for winning offeror
- Briefing for losing offerors

For samples, refer to the SCE Version 3 Implementation Guide (CMU/SEI-95-TR-012).

SCE in a Typical SS Timeline


[Diagram: SCE in a typical source selection timeline]
- Decision to use SCE; prepare SSP/Evaluation Plan; release RFP (45-90 days); bring together a trained SCE team
- Receive proposals: 6-8 candidate projects, project profiles, project questionnaires (45-60 days)
- SCE Method:
  - Pre-site visit: select sites, projects, and topics; coordinate the visit (1-2 days)
  - Site visit: 3-5 days per offeror
  - Post-site visit: findings report (2-3 days)
- SSEB determines technical rating/risk; brief SSAC/SSA (3-10 days); award

Summary of SCE Activities Prior to SCE Site Visits


Source selection organizations must:
- Select a qualified SCE team and define its relationship to the SSEB
- Define how SCE results will be used
  - Technical Area, Technical Item, Risk
  - Determine the relative weight of SCE and scoring criteria
- Provide funds for SCE (personnel, training, and travel)
- Schedule time for SCE activities (pre-site visit, site visits, final report)
- Insert SCE requirements in source selection documents (SOW, RFP, SSP; project profile, Maturity Questionnaire, Improvement Plan)
- Establish a plan to accommodate reuse of SCE results

Outline
- Source Selection Process With SCE Site Visit
- SCE Process
  - SCE Method Description
  - Team Process and Ground Rules
  - Sample Documents
  - Interviews
  - 3-Day Agenda Details
- Use of SCE Results in Source Selection
- Summary

SCE Method Description


- Plan and Prepare for Evaluation Phase
- Conduct Evaluation Phase
- Report Evaluation Results Phase

Reference: Software Capability Evaluation, Version 3.0 Method Description (CMU/SEI-96-TR-002) April 1996

General Appraisal Activities


[Diagram: general appraisal activity flow]
1. Plan Appraisal
2. Prepare for Appraisal
3. Collect Data
4. Data Consolidation: organize and combine data, then determine data sufficiency
   - If insufficient: review and revise the data-gathering plans and collect more data
   - If sufficient: proceed to rating
5. Make Rating Judgments
6. Report Results
7. Follow-Up Actions

SCE Team Process and Ground Rules

Interviews
- No attribution of information obtained to an individual or a specific project
- The team may interview an individual more than once
- The interview schedule will become dynamic after the first day
- All changes to the interview schedule are made through the site visit coordinator

Document Reviews

- The team will look for objective evidence (or lack of objective evidence) to substantiate what it hears in interviews
- All documentation requests are coordinated through the site coordinator
- All documentation is returned at the end of the site visit; the team will not make any copies of the documents

Findings

- The team must observe supporting evidence from two or more independent sources
- Findings are generated through a consensus process

Sample Documents

Documents referenced in the offeror's questionnaire are made available at the start of the SCE site visit.

Project Documents:
- Program Management Plan
- Software Development Plan
- Software Configuration Management Plan
- Software Quality Assurance Plan
- Software Test Procedures
- Software Standards and Procedures Manual
- Sample Software Development Folder

Division Documents:
- Software Policy, Standards, and Procedures
- Generic Software Development Plan
- Quality Assurance Plan
- Configuration Management Plan

Additional documents are requested on-site.

Interview Candidates and Length of Each Interview (Example):

Position                       Length of Interview (hrs)   Number Interviewed
Project Managers               0.5                         2
S/W Managers                   1.25                        3
Manager of QA                  0.5                         1
Project QA                     0.75                        1
Manager of CM                  0.5                         1
Project CM                     0.75                        1
SEPG                           1.25                        1
System Engineer                0.50                        1
Test Engineer                  0.50                        1
Project Subcontract Manager    0.75                        1
Developer                      0.75                        2
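As a rough cross-check of the on-site schedule, the interview lengths and headcounts above can be tallied into a total interview budget. The short Python sketch below is illustrative only; the figures are copied from the example table, and the structure is an assumption, not part of the SCE method.

# Tally total interview time for the example schedule above:
# (hours per interview) x (number of people interviewed in that role).
schedule = [
    ("Project Managers", 0.5, 2),
    ("S/W Managers", 1.25, 3),
    ("Manager of QA", 0.5, 1),
    ("Project QA", 0.75, 1),
    ("Manager of CM", 0.5, 1),
    ("Project CM", 0.75, 1),
    ("SEPG", 1.25, 1),
    ("System Engineer", 0.50, 1),
    ("Test Engineer", 0.50, 1),
    ("Project Subcontract Manager", 0.75, 1),
    ("Developer", 0.75, 2),
]

total_hours = sum(hours * count for _, hours, count in schedule)
print(total_hours)  # 11.75 hours of interviews in this example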

3-Day Agenda

Day 1:
- Introductory briefing by government team
- Overview briefing by offeror
- Interview senior managers

Day 2:
- Documentation review
- Exploratory interviews (e.g., SCM, SQA)
- Documentation review

Day 3:
- Interviews
- Report writing

SCE Implementation
Model basis
- Uses the SW-CMM, a globally accepted model, as the basis for the evaluation

Documentation and Training
- Is based on a documented method: Method Description Document, Implementation Guide
- Uses a defined and controlled training course for all evaluators and Lead Evaluators

Process
- Evaluates the organization's capability based on the same geographical site, business unit, and program characteristics as the acquisition under review
- Assigns a trained team in advance of the visit to script questions and prepare for the on-site period
- Requires a site visit (length depends on SW-CMM scope)
- Is based on confidential interview sessions (the entire team versus one or many interviewees) and in-depth document review with "rules of evidence"
- Requires consensus among a trained software evaluation team (generally 4-6 people)
- Focuses on a standard set of KPAs for each maturity level evaluated
- Uses a defined algorithm for rolling up results and creating ratings (e.g., sufficiency of evidence, alternative practices, institutionalization of common features, rolling up key practices into goal ratings)

SCE Implementation Issues


Questionnaire
- Use of the Maturity Questionnaire was modified to be more useful to the SCE team
- Difficult to staff a 5-member team with trained/experienced people
- Difficult to evaluate multiple sites due to time and stress on the team
- SCE team may or may not be SSEB members

Acquisition Process
- SCEs without discussions
- Contractor process typically not placed on contract
- One site visit may not be sufficient if the offeror establishes a multi-contractor team
- SCE requires modification to evaluate specific software technologies

SCE Method
- No current plan to evolve the SCE Method
- The initiative to reuse SCE results is not well known or used

CMM Maturity Questionnaire


What it is:
- a quick look at processes
- standardized
- an instrument to start a document review
- an index to possibly scope down the site evaluation

What it is not:
- a test
- an input for findings
- a replacement for CMM training

The Rating Process


- Team consensus
- All findings verified via at least two sources
- Team determines whether key practices are satisfied or unsatisfied
- Team determines whether goals are satisfied or unsatisfied, based on rating key practices and alternative practices
- Satisfied/unsatisfied goals roll up to the KPA rating
- Satisfied/unsatisfied KPAs roll up to the maturity level
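To make the roll-up concrete, here is a minimal Python sketch of the chain described above (key practices to goals, goals to KPAs, KPAs to maturity level). It is an illustrative reading of the slide, not the SEI's defined rating algorithm; the function names, KPA abbreviations, and data shapes are assumptions.

# Sketch of the rating roll-up: a goal is satisfied only if its key
# practices (or accepted alternative practices) are satisfied; a KPA is
# satisfied only if all of its goals are; a maturity level is achieved
# only if every KPA at that level and below is satisfied.

def rate_goal(practice_ratings):
    return all(practice_ratings)

def rate_kpa(goal_ratings):
    return all(goal_ratings)

def rate_maturity_level(kpa_ratings_by_level):
    level = 1
    for lvl in sorted(kpa_ratings_by_level):
        if all(kpa_ratings_by_level[lvl].values()):
            level = lvl
        else:
            break
    return level

# Example: all Level 2 KPAs satisfied, one Level 3 KPA unsatisfied -> Level 2.
ratings = {
    2: {"RM": True, "SPP": True, "SPTO": True, "SSM": True, "SQA": True, "SCM": True},
    3: {"OPF": False, "OPD": True, "TP": True, "ISM": True, "SPE": True,
        "IC": True, "PR": True},
}
print(rate_maturity_level(ratings))  # -> 2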

Outline
- Source Selection Process With SCE Site Visit
- SCE Process
- Use of SCE Results in Source Selection
  - KPA Findings
  - Summary of KPAs by Offeror
  - Example Technical Item
  - Example Technical Area
- Summary

Sample KPA Findings Peer Reviews: Satisfactory


Strengths
- Organizational-level policies, procedures, and standards exist for conducting Peer Reviews
- Training is provided for conducting Peer Reviews
- Peer Reviews have been conducted on all documents generated to date: SDP, OCD, SCM Plan, SQA Plan, Requirements Baseline, Design, Test Plan and Procedures
- Identified changes are tracked to completion (e.g., document red-lines)

Weaknesses
- Design and code review process undefined in the software development plan
- Lack of formal data on the conduct and results of Peer Reviews (e.g., review time, types of defects, product size)

CRs and DRs are written against weaknesses

Summary of SCE Results by Key Process Area

KPA (Levels 2-3):
Level 2: Configuration Management; Requirements Management; Quality Assurance; Project Tracking and Oversight; Project Planning; Subcontract Management
Level 3: Organization Process Definition; Organization Process Focus; Peer Reviews; Intergroup Coordination; Software Product Engineering; Integrated Software Management; Training Program

Contractor A: + + + + + + + + (8 of the 13 KPAs)
Contractor B: + + + + + + + + + + + + + (all 13 KPAs)

Legend:
+  Meets the goals of the predefined standard
-  Does not meet the goals
*  Not evaluated, since no history of subcontracting software development responsibilities

Example Use of SCE as Technical Item


Selection criteria (areas):
- Technical
- Cost
- Prior performance

Technical items:
- Management
- Technical approach
- Software Engineering Capability (e.g., SCE)

SCE findings are converted to a color rating with an associated high / moderate / low risk.

Color Rating and Risk Rating


Color rating (two example scales):

Color     Rating         Ex. A (maturity level)   Ex. B (KPAs satisfied)
Blue      Exceptional    L4                       11-13
Green     Acceptable     L3                       8-10
Yellow    Marginal       L2                       6-7
Red       Unacceptable   L1                       <6

Risk rating (achievability of the improvement plan vs. SCE results):
- High: likely to cause disruption of schedule, increase cost, degrade performance
- Medium: potential to cause ...
- Low: little potential to cause ...
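The two example scales above can be written down as simple threshold rules. The Python sketch below is illustrative only; the thresholds come from the table, while the function names and the decision to encode the scales in code are assumptions, not part of the SCE method.

# Ex. A: color rating keyed off the evaluated CMM maturity level.
def color_from_level(level):
    if level >= 4:
        return "Blue"      # Exceptional
    if level == 3:
        return "Green"     # Acceptable
    if level == 2:
        return "Yellow"    # Marginal
    return "Red"           # Unacceptable

# Ex. B: color rating keyed off the number of KPAs satisfied (of 13).
def color_from_kpa_count(satisfied_kpas):
    if satisfied_kpas >= 11:
        return "Blue"
    if satisfied_kpas >= 8:
        return "Green"
    if satisfied_kpas >= 6:
        return "Yellow"
    return "Red"

print(color_from_level(3))       # -> Green
print(color_from_kpa_count(7))   # -> Yellow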

Translating Findings Into Risk


Key Process Area: Software Project Tracking and Oversight
- Current practice: weak
- Proposed practice: strong
- Correctability: improvement is not current practice; no efforts to improve indicated
- Risk: high

Key Process Area: Software Quality Assurance
- Current practice: weak and strong
- Proposed practice: strong
- Correctability: high (project C; new company policy)
- Risk: low, requires monitoring

Example of Color and Risk Assessment


Item: T-3 Software Engineering Capability
Offeror A - Blue
- Strengths: Requirements Management; Peer Reviews; Software Project Tracking and Oversight; Organization Process Focus
- Weaknesses: None
- Risk: Offeror is very strong technically and is committed to developing quality software using a continuously improving development process.

Offeror B - Yellow
- Strengths: Software Quality Assurance; Training Program
- Weaknesses: Organization Process Focus
- Risk: Because of the large disparity between our findings and their submitted SPIP, it is highly questionable whether the software process improvement is being implemented.

Offeror C - Yellow
- Weaknesses: Peer Reviews; Software Project Tracking and Oversight; Training Program
- Risk: Offeror has a realistic SPIP indicating they are at the initial maturity level, with their best practices being applied to all new programs.

Example of Area Summary


Area: Technical
Offeror:                          D   E   F   G   H   I
Management                        G   G   B   G   G   B
Technical app.                    G   G   B   B   G   B
Software Engineering Capability   B   Y   Y   Y   G   B
Red/Orange test                   P   P   P   P   F   P
Ada test                          P   P   P   F   P   P

Summary rating: Y, Y, Y
Risk: L, L, M, H

Outline
- Source Selection Process With SCE Site Visit
- SCE Process
- Use of SCE Results in Source Selection
- Summary
  - Gansler Policy and Implementation Questions
  - SCE Process

Gansler's Policy and Implementation Questions


Software evaluations for ACAT I programs

"... it will be a technical requirement for contract that each contractor performing software development or upgrades for use in an ACAT I program will undergo an evaluation."
- Question: Will there be guidance for use of evaluation results as an Area, Item, or Factor, with or without Risk, or will that be left up to each acquisition?
- Question: Will there be guidance for use of CBA-IPIs or third-party SCEs, or must all acquisitions have a government-led SCE?
- Question: Will there be guidance for reuse of SCE results?

"At minimum, full compliance with Level 3 or its equivalent level in an approved evaluation tool ..."
- Question: Will there be guidance defining Blue, Green, Yellow, Red and other rating approaches (e.g., Level 3 = green, 10 < x < 13 KPAs = yellow)?

"If the prospective contractor does not meet full compliance, a risk mitigation plan and schedule must be prepared that will describe in detail actions that will be taken to remove deficiencies uncovered in the evaluation process and must be provided to the Program Manager for approval."
- Question: Will there be guidance for use of the risk mitigation plan in source selection (e.g., the risk of the proposed improvement plan is high if it is not realistic)?
- Question: Will there be guidance to put the mitigation plan on contract and in the award fee?

SCE Method Summary


- Plan and Prepare for Evaluation Phase
- Conduct Evaluation Phase
- Report Evaluation Results Phase

Reference: Software Capability Evaluation, Version 3.0 Method Description (CMU/SEI-96-TR-002) April 1996

Backup Slides

[Table: Maturity Questionnaire responses]
Columns: KPA | Subprocess Areas (Question) | Questionnaire Responses for Projects A, B, C, D | Comments

RM - Requirements Management
- RM.1: Are system requirements allocated to software used to establish a baseline for software engineering and management use?
- RM.2: As the system requirements allocated to software change, are the necessary adjustments to software plans, work products, and activities made?
- RM.3: Does the project follow a written organizational policy for managing the system requirements allocated to software?

Rating Scale

CMM component        Rating scale (each component)
common feature*      satisfied / unsatisfied / not applicable / not rated
goal                 satisfied / unsatisfied / not applicable / not rated
Key Process Area     satisfied / unsatisfied / not applicable / not rated
key practice*        satisfied / unsatisfied / not applicable / not rated
maturity level*      level 1 through level 5

* optional

Rating Principles
- A rating of satisfied is required if the reference model practices are implemented, or if alternative practices are implemented that achieve the intent of the goals.
- A rating of not applicable is required if the practices do not apply in the organization's environment.
- A rating of unsatisfied cannot be given unless valid weaknesses that significantly impact the goals are identified.
- A rating cannot be given unless the planned model coverage is attained.
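Taken together, these principles amount to a small decision rule. The Python sketch below is one possible reading of it; modeling the inputs as simple booleans (coverage attained, applicability, practices implemented, significant weaknesses) is an assumption for illustration, not part of the method definition.

# One possible reading of the rating principles above.
def rate_component(coverage_attained, applicable, practices_implemented,
                   significant_weaknesses):
    if not coverage_attained:
        return "not rated"        # planned model coverage not attained
    if not applicable:
        return "not applicable"   # practices do not apply in this environment
    if significant_weaknesses:
        return "unsatisfied"      # valid weaknesses significantly impact the goals
    if practices_implemented:
        return "satisfied"        # reference or alternative practices achieve the goals
    return "not rated"            # no rating without sufficient evidence

print(rate_component(True, True, True, False))   # -> satisfied
print(rate_component(True, True, False, True))   # -> unsatisfied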

Rating Prerequisites
Observations are sufficient when they are:
- validated
- covered
- corroborated

Valid observations are:
- accurate
- agreed to by the team (consensus)
