Outline
Source Selection Process With SCE Site Visit
  SS Timeline and SCEs
  SS Decision Process
  SS Documentation Affected
  Summary of Activities Prior to Site Visit
SCE Process
Use of SCE Results in Source Selection
Summary
[Figure: Source selection timeline with SCE. SSP/Evaluation Plan and RFP release (45-90 days); offerors submit proposals; a trained SCE team is brought together; SCE site visits of 3-5 days per offeror during the 45-60 day evaluation period; SSEB determines technical rating/risk, briefs SSAC/SSA, and award follows (3-10 days). Evaluation factors shown include cost, management, and other factors.]
Outline
Source Selection Process With SCE Site Visit
SCE Process
  SCE Method Description
  Team Process and Ground Rules
  Sample Documents
  Interviews
  3 Day Agenda Details
Use of SCE Results in Source Selection
Summary
Reference: Software Capability Evaluation, Version 3.0 Method Description (CMU/SEI-96-TR-002) April 1996
[Figure: data consolidation flow; insufficient data triggers further collection before findings are produced.]
SCE Team Process and Ground Rules
Interviews:
  No attribution of information obtained to an individual or a specific project
  Team may interview an individual more than once
  Interview schedule will become dynamic after the first day
  All changes to the interview schedule made through the site visit coordinator
Document Reviews:
  The team will look for objective evidence (or lack of objective evidence) to substantiate what it hears in interviews
  All documentation requests coordinated through the site coordinator
  All documentation returned at the end of the site visit; the team will not make any copies of the documents
Findings
Sample Documents
Documents referenced in the offerors' questionnaire made available at the start of the SCE site visit
Project Documents:
Program Management Plan
Software Development Plan
Software Configuration Management Plan
Software Quality Assurance Plan
Software Test Procedures
Software Standards and Procedures Manual
Sample Software Development Folder

Division Documents:
Software Policy, Standards, and Procedures
Generic Software Development Plan
Quality Assurance Plan
Configuration Management Plan
Day 2:
Documentation review
Exploratory interviews (e.g., SCM, SQA)
Documentation review
Day 3:
Interviews
Report writing
SCE Implementation
Model basis
  Uses the SW-CMM, a globally accepted model, as the basis for the evaluation
Documentation and Training
  Based on a documented method: Method Description Document, Implementation Guide
  Uses a defined and controlled training course for all evaluators and Lead Evaluators
Process
  Evaluates the organization's capability based on the same geographical site, business unit, and program characteristics as the acquisition under review
  Assigns a trained team in advance of the visit to script questions and prepare for the on-site period
  Requires a site visit (length depends on SW-CMM scope)
  Based on confidential interview sessions (the entire team versus one or many interviewees) and in-depth document review with 'rules of evidence'
  Requires consensus among a trained software evaluation team (generally 4-6 people)
  Focuses on a standard set of KPAs for each maturity level evaluated
  Uses a defined algorithm for rolling up results and creating ratings, e.g., sufficiency of evidence, alternative practices, institutionalization of common features, rolling key practices up into goal ratings (a simplified roll-up sketch follows this list)
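The roll-up bullet above can be pictured with a short sketch. The Python below is an illustration only, under simplifying assumptions: the class names, fields, and rules (a goal with valid significant weaknesses is unsatisfied, a KPA is satisfied only if all its goals are, a level is credited only if every applicable KPA at and below it is satisfied) are not the SCE V3.0 data model or defined algorithm.

# Illustrative roll-up sketch; names and rules are simplified assumptions,
# not the SCE V3.0 defined algorithm.
from dataclasses import dataclass, field

@dataclass
class Goal:
    name: str
    significant_weaknesses: list = field(default_factory=list)  # valid weaknesses impacting the goal
    evidence_sufficient: bool = True                             # enough corroborated observations

    def rating(self) -> str:
        if not self.evidence_sufficient:
            return "not rated"
        return "unsatisfied" if self.significant_weaknesses else "satisfied"

@dataclass
class KPA:
    name: str
    level: int                                  # maturity level the KPA belongs to
    goals: list = field(default_factory=list)
    applicable: bool = True

    def rating(self) -> str:
        if not self.applicable:
            return "not applicable"
        ratings = [g.rating() for g in self.goals]
        if "not rated" in ratings:
            return "not rated"
        return "satisfied" if all(r == "satisfied" for r in ratings) else "unsatisfied"

def maturity_level(kpas: list) -> int:
    # A level is credited only when every applicable KPA at that level is
    # satisfied and no lower evaluated level has failed (simplified).
    achieved = 1
    for lvl in sorted({k.level for k in kpas}):
        at_level = [k for k in kpas if k.level == lvl and k.applicable]
        if at_level and all(k.rating() == "satisfied" for k in at_level):
            achieved = lvl
        else:
            break
    return achieved

# Example: a single weakness against a Level 2 goal keeps the roll-up at Level 1.
rm = KPA("Requirements Management", level=2,
         goals=[Goal("Allocated requirements are baselined",
                     significant_weaknesses=["changes not reflected in software plans"])])
print(maturity_level([rm]))  # -> 1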
Outline
Source Selection Process With SCE Site Visit
SCE Process
Use of SCE Results in Source Selection
  KPA Findings
  Summary of KPAs by Offeror
  Example Technical Item
  Example Technical Area
Summary
KPA (Levels 2-3)
Level 2: Configuration Management, Requirements Management, Quality Assurance, Project Tracking and Oversight, Project Planning, Subcontract Management
Level 3: Organization Process Definition, Organization Process Focus, Peer Reviews, Intergroup Coordination, Software Product Engineering, Integrated Software Management, Training Program
Contractor A: rated + on 8 of the 13 KPAs
Contractor B: rated + on all 13 KPAs
[Chart: Ex. A, maturity levels L1-L4]
Risk Rating (achievability of improvement plan vs. SCE results)
  High: likely to cause disruption of schedule, increase cost, degrade performance
  Medium: potential to cause ...
  Low: little potential to cause ...
[Chart: example KPA findings by offeror plotted against risk. Offeror B - Yellow and Offeror C - Yellow; KPAs called out include Organization Process Focus, Requirements Management, Peer Reviews, and Software Project Tracking and Oversight (with "None" noted for one offeror); column label "strong"; axis label "Risk".]
Offeror is very strong technically and is committed to developing quality software using a continuously improving development process.
Because of the large disparity between our findings and their submitted SPIP, it is highly questionable whether the software process improvement plan is being implemented.
Offeror has a realistic SPIP indicating they are at the initial maturity level with their best practices being applied to all new programs.
Management | Technical app. | Software Engineering Capability | Red/Orange test | Ada test | Summary rating
G          | G              | B                               | P               | P        | Y
G          | G              | Y                               | P               | P        | L L
B          | B              | Y                               | P               | P        | Y
G          | B              | Y                               | P               | F        | M
G          | G              | G                               | F               | P        | Y
B          | B              | B                               | P               | P        | H
Outline
Source Selection Process With SCE Site Visit
SCE Process
Use of SCE Results in Source Selection
Summary
  Gansler Policy and Implementation Questions
  SCE Process
Reference: Software Capability Evaluation, Version 3.0 Method Description (CMU/SEI-96-TR-002) April 1996
Backup Slides
Projects:
A=
B=
C=
D=

KPA | Questionnaire Responses | Comments
RM: Requirements Management
RM.1: Are system requirements allocated to software used to establish a baseline for software engineering and management use?
RM.2: As the system requirements allocated to S/W change, are the necessary adjustments to S/W plans, work products, and activities made?
RM.3: Does the project follow a written organizational policy for managing the system requirements allocated to software?
Rating Scale
CMM components:
  common feature*
  goal
  Key Process Area
  key practice*
  maturity level*
* - optional
Rating scale (each component): satisfied, unsatisfied, not applicable, not rated
Rating Principles
Rating of satisfied is required:
  if reference model practices are implemented, or
  if alternate practices are implemented which achieve the intent of the goals
Rating of not applicable is required if practices do not apply in the organization's environment.
Rating of unsatisfied cannot be given unless valid weaknesses that significantly impact goals are identified.
A rating cannot be given unless planned model coverage is attained.
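Read as a decision rule, the principles above might look like the sketch below; the parameter names are illustrative assumptions, not part of the SCE method definition.

# Sketch of the rating principles; parameter names are illustrative assumptions.
def rate_component(practices_apply: bool,
                   planned_coverage_attained: bool,
                   reference_practices_implemented: bool,
                   alternate_practices_achieve_goals: bool,
                   valid_significant_weaknesses: list) -> str:
    if not practices_apply:
        return "not applicable"          # practices do not apply in this environment
    if not planned_coverage_attained:
        return "not rated"               # planned model coverage not attained
    if valid_significant_weaknesses:
        return "unsatisfied"             # only valid, goal-impacting weaknesses justify this
    if reference_practices_implemented or alternate_practices_achieve_goals:
        return "satisfied"
    return "not rated"                   # insufficient basis to rate either way

# Example: alternate practices that achieve the goals still earn "satisfied".
print(rate_component(True, True, False, True, []))  # -> satisfied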
Rating Prerequisites
Observations are:
  sufficient
  validated
  covered
  corroborated
Valid observations are:
  accurate
  agreed to by the team (consensus)
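A minimal sketch of these prerequisites, with assumed field names rather than the method's actual data model:

# Sketch of the rating prerequisites; field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Observation:
    text: str
    accurate: bool            # factually correct as worded
    team_consensus: bool      # agreed to by the whole team
    covered: bool             # maps onto the model practices being rated
    corroborated: bool        # confirmed by an independent source
    sufficient: bool          # enough evidence to support a rating

def is_valid(obs: Observation) -> bool:
    # Valid observations are accurate and agreed to by the team (consensus).
    return obs.accurate and obs.team_consensus

def ready_to_rate(obs: Observation) -> bool:
    # Prerequisites before rating: sufficient, validated, covered, corroborated.
    return obs.sufficient and is_valid(obs) and obs.covered and obs.corroborated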