SIGNATURE PAGE
This page shows the signatures of applicable team members, indicating that the content of this
document has been carefully reviewed and concurred with.
_________________________________
Jagati Vempaty
Lead Requirements Analyst
__________________
Date
_________________________________
Harish C. Sreenivas
Lead Architect/Designer
__________________
Date
_________________________________
Adam Scott
Lead Developer
__________________
Date
_________________________________
Seth Zaah
Lead QA / Tester
__________________
Date
_________________________________
Tom Q. Le
Project Manager
__________________
Date
_________________________________
Frank Tsui
Project Advisor
__________________
Date
REVISION HISTORY
The following table lists the history of changes made to this document. When a change is made,
the following attributes are updated: version, date, A/M/D, description, and CRN. The version
indicates major and minor changes; nine minor versions roll up into one major version. The A/M/D
field indicates whether the change is an addition, modification, or deletion. The CRN is the change
request number, if applicable. The description briefly describes the changes.
Version | Date | A/M/D | Description | CRN
1.0 | 08-Sep-2010 | A | Initial Draft | N/A
1.1 | 20-Sep-2010 | A,M | Baseline | N/A
1.2 | 27-Sep-2010 | M | Baseline updated after peer review | N/A
PREFACE
The purpose of this Software Project Management Plan (SPMP) is to provide guidance on the
management of the Test Management Tool (TMT), a part of the Southern Polytechnic State
University (SPSU) Testing System (STS) software development project. The format and content of
this plan conform to the Institute of Electrical and Electronics Engineers (IEEE) Standard for
Software Project Management Plans, IEEE Std 1058-1998 [1]. The template and its standard were
selected because they are flexible and applicable to any type of project, well known to and
accepted by many software companies, and common in the software engineering department at
SPSU. The TMT Project Manager (PM) assumes responsibility for this document and updates it as
required to meet the needs of the project. Updates to this document are performed in accordance
with the configuration management process defined within this document and are reviewed and
concurred with by the applicable team members listed on the signature page of this document.
This document relates to other documents as depicted in the following project document tree:
Figure 1.1-1: Project Document Tree (the Software Project Management Plan, TMT-SPMP, at the
root, with the subordinate POMA documents beneath it)
TABLE OF CONTENTS
SIGNATURE PAGE ...................................................................................................................... 3
LIST OF FIGURES ...................................................................................................................... 12
LIST OF TABLES ........................................................................................................................ 13
1. OVERVIEW ............................................................................................................................. 15
1.1. Project Summary ................................................................................................................ 15
1.1.1. Purpose, Scope, and Objectives .................................................................................. 16
1.1.2. Assumptions and Constraints...................................................................................... 16
1.1.3. Project Deliverables .................................................................................................... 17
1.1.4. Schedule and Budget Summary .................................................................................. 17
1.2. Evolution of the Plan ......................................................................................................... 18
1.3. Document Overview .......................................................................................................... 19
2. REFERENCES ......................................................................................................................... 21
2.1. Standards, Documents, and Other Material ....................................................................... 21
2.2. Deviations and Waivers ..................................................................................................... 21
3. DEFINITIONS.......................................................................................................................... 23
4. PROJECT ORGANIZATION .................................................................................................. 25
4.1. External Interfaces ............................................................................................................. 25
4.2. Internal Structure ............................................................................................................... 26
4.3. Roles and Responsibilities ................................................................................................. 26
5. MANAGERIAL PROCESS PLANS........................................................................................ 29
5.1. Start-up Plan....................................................................................................................... 29
5.1.1. Estimation Plan ........................................................................................................... 29
5.1.2. Staffing Plan................................................................................................................ 29
5.1.3. Resource Acquisition Plan .......................................................................................... 30
5.1.4. Project Staff Training Plan.......................................................................................... 31
5.2. Work Plan .......................................................................................................................... 31
5.2.1. Work Activities ........................................................................................................... 31
5.2.2. Schedule Allocation .................................................................................................... 31
5.2.3. Resource Allocation .................................................................................................... 31
5.2.4. Budget Allocation ....................................................................................................... 31
5.3. Control Plan ....................................................................................................................... 31
5.3.1. Requirements Control Plan ......................................................................................... 32
LIST OF FIGURES
Figure 1.1-1: Project Document Tree ............................................................................................. 7
Figure 1.1-1: System Context Diagram ........................................................................................ 15
Figure 4.1-1: Project External Organization Interfaces ................................................................ 25
Figure 4.2-1: Project Internal Organization Structure .................................................................. 26
Figure 6.1-1: TMT Software Development Life Cycle (SDLC) .................................................. 37
Figure 6.2-1: TMT Requirements Process Flow........................................................................... 38
Figure 7.1-1: TMT Configuration Management Flow .................................................................. 41
Figure 7.4-1: Peer Review Process ............................................................................................... 42
Figure 7.4-2: Walkthrough Review Process ................................................................................. 43
LIST OF TABLES
Table 1.1-1: Project Deliverables ................................................................................................. 17
Table 1.1-2: Master Build Schedule ............................................................................................. 17
Table 1.1-3: Project Budget Summary.......................................................................................... 17
Table 4.1-1: Programmatic Roles and Responsibilities ................................................................ 25
Table 4.3-1: Project Roles and Responsibilities ........................................................................... 26
Table 5.1-1: TMT Team Member Skill Levels ............................................................................. 30
Table 5.1-2: Non-Staff Resource Acquisition .............................................................................. 30
Table 5.3-1: TMT Team Member Contact Information ............................................................... 34
Table 5.3-2: TMT Metrics ............................................................................................................ 35
Table 6.2-1: TMT Programming Languages ................................................................................ 39
1. OVERVIEW
1.1. Project Summary
The SPSU Testing System (STS) is a suite of software testing tools developed by SPSU students
majoring in Computer Science and/or Software Engineering to provide services to businesses
that need software testing. The STS aims to enhance manual software testing (via a better-defined
process, automatic test case generation, and faster test cycles), improve test management and
control, and improve software product quality (via early defect detection). The STS is a web-based
system operating on a dedicated server with which businesses can register to gain access to
different testing tools, as described in the following system context diagram:
Figure 1.1-1: System Context Diagram (the Test Analyst defines the test model from requirement
specifications, design specifications, or code; the Test Case Generation Tool generates test cases;
the Test Execution Tool executes them and reports test result information and errors; the Test
Management Tool stores test cases and test information, generates test reports, and supports
test-information queries; the Lead Tester assigns test cases, and the Testers update them)
1.1.3. Project Deliverables

Table 1.1-1: Project Deliverables

Date | Quantity | Acquirer | Media
28-Nov-2010 | 2 | SPSU | Document
28-Nov-2010 | 1 | SPSU | Document
28-Nov-2010 | 1 | SPSU | Document
28-Nov-2010 | 1 | SPSU | DVD-ROM
28-Nov-2010 | 2 | SPSU | Document
28-Nov-2010 | 1 | SPSU | Document
28-Nov-2010 | 1 | SPSU | Document
28-Nov-2010 | 1 | SPSU | Document
Weekly | 14 | SPSU | PowerPoint
28-Nov-2010 | 10 | SPSU | PowerPoint
***Note: The above deliverables are delivered incrementally for review. The date specified is
for the final deliverable.
1.1.4. Schedule and Budget Summary
The TMT Team uses MS Excel (as part of the TMT Team's Metric Program) to track and
manage tasks, resources, schedule, and budget; the detailed Work Breakdown Structure
(WBS), cost, schedule, and staffing resources are therefore defined there. The following tables
show the high-level master build schedule and program budget summary:
Table 1.1-2: Master Build Schedule

Date | Build | Description
20-Sep-2010 | Increment 01 | Demo of overall system architecture & design.
27-Sep-2010 | Increment 02 | Demo of Test Case Management Feature.
04-Oct-2010 | Increment 03 | Demo of TGT & TMT Integration #1.
11-Oct-2010 | Increment 04 | Demo of Req. Info. Management Feature.
18-Oct-2010 | Increment 05 | Demo of TGT & TMT Integration #2.
25-Oct-2010 | Increment 06 | Demo of Test Resource Management Feature.
01-Nov-2010 | Increment 07 | Demo of Test Result Management Feature.
08-Nov-2010 | Increment 08 | Demo of Report Generation Feature.
15-Nov-2010 | Increment 09 | Demo of Data Query Feature.
22-Nov-2010 | Increment 10 | Demo of TMT full functionality.
29-Nov-2010 | Increment 11 | Final integration of full functionality for TMT & TGT into STS.
06-Dec-2010 | Increment 12 | IAB Demo.
Table 1.1-3: Project Budget Summary

Date | Budget | Description
06-Sep-2010 | 5,000.00 | Organize development team and develop project plans.
13-Sep-2010 | 10,000.00 | System analysis and planning.
20-Sep-2010 | 10,000.00 | Increment 01: Overall system architecture & design.
27-Sep-2010 | 10,000.00 | Increment 02: Test Case Management Feature.
04-Oct-2010 | 10,000.00 | Increment 03: TGT & TMT Integration #1.
11-Oct-2010 | 10,000.00 | Increment 04: Req. Info. Management Feature.
18-Oct-2010 | 10,000.00 | Increment 05: TGT & TMT Integration #2.
25-Oct-2010 | 10,000.00 | Increment 06: Test Resource Management Feature.
01-Nov-2010 | 10,000.00 | Increment 07: Test Result Management Feature.
08-Nov-2010 | 10,000.00 | Increment 08: Report Generation Feature.
15-Nov-2010 | 10,000.00 | Increment 09: Data Query Feature.
22-Nov-2010 | 10,000.00 | Increment 10: TMT full functionality.
29-Nov-2010 | 10,000.00 | Increment 11: Final integration of full functionality for TMT & TGT into STS.
06-Dec-2010 | 15,000.00 | Increment 12: IAB Demo.
2. REFERENCES
2.1. Standards, Documents, and Other Material
The standards and documents listed below are referenced in this document:
[1] Institute of Electrical and Electronics Engineers (IEEE), IEEE Standard for Software Project
Management Plans, IEEE Std 1058-1998, December 1998.
[2] Model-Based Testing: Next Generation Functional Software Testing, STN 12-4 (Model-Driven
Development), January 2010.
3. DEFINITIONS
The abbreviations, acronyms, and terms used throughout this document are listed below:

Term | Definition
ACWP | Actual Cost of Work Performed
A/M/D | Addition/Modification/Deletion
ASP | Active Server Pages
BCWP | Budgeted Cost of Work Performed
BCWS | Budgeted Cost of Work Scheduled
CRN | Change Request Number
CSS | Cascading Style Sheets
EDR | Effective Defect Removal
EV | Earned Value
GQM | Goal-Question-Metric
HTML | Hypertext Markup Language
IAB | Industry Advisory Board
IDE | Integrated Development Environment
IE | Internet Explorer
IEEE | Institute of Electrical and Electronics Engineers
IIS | Internet Information Services
MS | Microsoft
MTBF | Mean Time Between Failures
N/A | Not Applicable
PM | Project Manager
POMA | Planning, Organizing, Monitoring, and Adjusting
QA | Quality Assurance
SDD | Software Design Document
SDLC | Software Development Life Cycle
SDSR | Software Development Status Report
SPMP | Software Project Management Plan
SPSU | Southern Polytechnic State University
SQL | Structured Query Language
SRS | Software Requirements Specification
STD | Software Test Description
STP | Software Test Plan
STR | Software Test Report
STS | SPSU Testing System
SUM | Software User's Manual
SVD | Software Version Description
SWE | Software Engineering
TBD | To Be Determined
TET | Test Execution Tool
TGT | Test Generation Tool
TMT | Test Management Tool
T-SQL | Transact-SQL
WBS | Work Breakdown Structure
4. PROJECT ORGANIZATION
4.1. External Interfaces
External organizational interfaces and the chain of command for the STS project are defined in the
following figure. All organizations shown in the figure are located within the Fall 2010 SWE7903
Capstone Course at SPSU.

Figure 4.1-1: Project External Organization Interfaces

The responsibilities of the organizational entities and key positions shown in the figure above are
defined in the following table:
Table 4.1-1: Programmatic Roles and Responsibilities

Position | Roles/Responsibilities
SPSU Software Engineering Department | Project Review Board; provides the final review on the completion of the project.
Project Advisor (Dr. Frank Tsui) | Serves as professional consultant to the development processes and audits the progress of the project.
Test Generation Tool Team | A team of 8 undergraduates to develop the TGT.
Test Management Tool Team | A team of 5 graduates to develop the TMT and integrate the STS for demos.
TGT Project Manager (Stephen E. Fyffe) | Leads the development effort of the TGT team.
TMT Project Manager (Tom Q. Le) | Leads the development effort of the TMT team.
4.3. Roles and Responsibilities

Table 4.3-1: Project Roles and Responsibilities

TMT Project Manager:
- Organize and manage the project team.
- Define project goals and establish measurements.
- Develop project plans and processes.
- Define, assign, and monitor tasks.
- Monitor work progress and report statuses.
- Calculate earned values.
- Develop, review, and approve all documents.
- Coordinate with all team members for risk management and issue control.
- Manage configurations, builds, and releases.
- Review and approve requirements, designs, code, test scripts, and test reports.

TMT Requirements Lead:
- Assist and be the TMT PM's backup.
- Work in parallel with the TMT PM in understanding the project scope.
- Initiate and lead requirements elicitation, prototyping, and specification.
- Coordinate with the Design Lead to ensure the designs satisfy the requirements.
- Review requirements, designs, code, and test cases.

TMT Design Lead:
- Assist and be the Requirements Lead's backup.
- Participate in requirements elicitation, prototyping, modeling, specification, and review.
- Work in parallel with the TMT Requirements Lead to design the system architecture and software designs.
- Initiate and lead the design processes.
- Coordinate with the TMT Development Lead to ensure the code satisfies the design and requirements.
- Participate in design and code reviews.

TMT Development Lead:
- Assist and be the TMT Design Lead's backup.
- Participate in design, modeling, and reviews of designs and code.
- Work in parallel with the TMT Design Lead to ensure code is developed in accordance with the designs.
- Initiate and lead the coding efforts.
- Coordinate with the developers and testers.
- Participate in software reviews.

TMT Test Lead:
- Assist and be the TMT Development Lead's backup.
- Participate in coding and other reviews.
- Work in parallel with the Requirements Lead and Development Lead to develop test plans and test cases.
- Initiate and lead the test process.
- Participate in other software reviews.

TMT Developers:
- Assist and be the TMT Test Lead's backup.
- Participate in developing and executing test cases and collecting test results.
- Participate in other software reviews.
Table 5.1-1: TMT Team Member Skill Levels

Name | Skill Level
Adam Scott | 8 years at Microsoft with software development experience involving: Windows, UNIX, Linux, Mac, WP7, and Symbian platforms; client-server and stand-alone models; Visual Studio 2005-2010, NetBeans, and Eclipse IDEs; Java, C/C++, C# 2.0-4.0, HTML, Silverlight, jQuery, PHP, and SQL programming languages; MS SQL 2005-2008, MySQL, DB2, and PostgreSQL databases; client machines and Windows Server environments. Available 20-30 hours per week.
Harish Sreenivas | No work experience, but involved in school projects. Good at designs, UML, and requirements. Available 20 hours per week.
Jagati Vempaty | 4-month internship as business analyst and data warehouse designer. Available 25 hours per week.
Seth Zaah | More than 8 years in Field Service Support; knowledgeable in Windows, Java, C++, Rational Rose, MySQL, PL/SQL, HTML, and Process Improvement. Available 20-25 hours per week (Mon-Sat, 10:00 AM to 7:00 PM).
Tom Le | Currently works at Lockheed Martin Aeronautics, with more than 10 years of experience in software project leadership and development involving: Windows, Unix, and embedded platforms; client-server, stand-alone, and web-based models; Visual Studio 2000-2010, Borland JBuilder, Borland C++, Rational Apex, and UltraEdit IDEs; Java, C/C++, C# 1.0-4.0, Ada, Visual Basic, HTML, JavaScript, VBScript, SQL, T-SQL, PL/SQL, ASP, ASP.NET, and JSP programming languages; MS SQL 2000-2008, Oracle, and Access databases; IIS, Tomcat, Apache, and desktop environments. Available 20-30 hours per week.
Table 5.1-2: Non-Staff Resource Acquisition

Date | Cost | Description
30-Aug-2010 | Free | MS Visual Studio 2010, an IDE.
30-Aug-2010 | $4.95/month, paid by Tom | Test environment at http://www.winhost.com.
20-Sep-2010 | Paid by Adam | Configuration Management Tool (at http://adam.404.unfuddle.com).
reviewed by both the Project Advisor and the team members. The approved plans are posted to
the team website at http://cse.spsu.edu/ftsui/SWE7903.html. Periodic reviews, weekly status
reports, and monthly assessments (i.e., earned value and performance) are conducted to assess
risks, initiate risk analysis, establish risk mitigation, and provide corrective actions to the plans.
Corrective action items are documented in the TMT Team's Metric Program and communicated to
the TMT Project Advisor and team members.
5.3.1. Requirements Control Plan
The TMT PM and Requirements Analyst are responsible for the project's requirements process.
The Requirements Team specifies all requirements in the Software Requirements Specification.
The TMT Project Advisor, PM, and all technical leads review the requirements for correctness and
ensure the integrity of the requirements via the bi-directional traceability matrices built into each
major document (i.e., SRS, SDD, STP, and STD). As traceability matrices are produced, the
Requirements Team can determine whether all requirements for each increment are met. Any
update to the requirements results in a new revision of the SRS.
***NOTE: If time permits, the newly developed TMT tool may also be used to control requirements
traceability.
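For illustration only, a single row of such a bi-directional traceability matrix might look like the
following (the identifiers, module, and test case names are hypothetical, not actual TMT artifacts):

SRS Requirement | SDD Design Element | Source Code Unit | STD Test Case
SRS-3.2.1 (Generate test report) | SDD-4.5 (Report Generator) | ReportGenerator.cs | STD-TC-017

Reading a row left to right verifies forward traceability (each requirement is designed, implemented,
and tested); reading right to left verifies backward traceability (each artifact traces back to a
requirement).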
5.3.2. Schedule Control Plan
The TMT Team uses the TMT Team's Metric Program to measure the progress of work
completed at major and minor project milestones, to compare actual progress to planned
progress, and to implement corrective action when actual progress does not conform to planned
progress. Achievement of schedule milestones is assessed using objective criteria that measure
the scope and quality of the work products completed at each milestone, as specified in
Section 5.3.6: Metrics Collection Plan of this SPMP.
5.3.2.1. Schedule Tracking
Project progress is charted weekly using the TMT Team's Metric Program. Schedule
performance data is generated at the task level and compared to the proposed schedule. Reports
are generated that provide data for incremental effort estimation adjustments and projected future
performance. The actual start dates, finish dates, task completion percentages, actual costs, and
resources used on each task are recorded in the TMT Team's Metric Program.
5.3.2.2. Schedule Performance Reports
Project schedule status is measured against the required/planned dates, and up-to-date
performance reports are extracted from the TMT Team's Metric Program and presented at the
weekly status meeting with the TMT Project Advisor every Monday at 7:45 PM. Microsoft
Excel is used to calculate the Actual Cost of Work Performed (ACWP), Budgeted Cost of Work
Scheduled (BCWS), Budgeted Cost of Work Performed (BCWP), and Estimate at Completion (EAC).
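The following minimal C# sketch illustrates how these earned-value quantities are typically
combined; the input values are assumed for illustration (only the $140,000.00 Budget at
Completion comes from Table 1.1-3), and the TMT Team performs these calculations in MS Excel
rather than in code:

    // Minimal earned-value calculation sketch; sample inputs are assumed values.
    using System;

    class EarnedValueSketch
    {
        static void Main()
        {
            double bcws = 30000.00;  // Budgeted Cost of Work Scheduled (planned value to date), assumed
            double bcwp = 25000.00;  // Budgeted Cost of Work Performed (earned value to date), assumed
            double acwp = 28000.00;  // Actual Cost of Work Performed (actual cost to date), assumed
            double bac = 140000.00;  // Budget at Completion, per Table 1.1-3

            double sv = bcwp - bcws;   // Schedule Variance: negative means behind schedule
            double cv = bcwp - acwp;   // Cost Variance: negative means over budget
            double spi = bcwp / bcws;  // Schedule Performance Index
            double cpi = bcwp / acwp;  // Cost Performance Index
            double eac = bac / cpi;    // Estimate at Completion, assuming current cost efficiency persists

            Console.WriteLine($"SV={sv:F2}, CV={cv:F2}, SPI={spi:F2}, CPI={cpi:F2}, EAC={eac:F2}");
        }
    }

With these assumed inputs, SPI below 1.0 flags the behind-schedule condition and CPI below 1.0
flags the cost overrun that trigger the corrective actions described in this section.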
5.3.2.3. Schedule Reviews
Schedule reviews are conducted by the TMT Project Advisor, PM, and technical leads on a
weekly basis (i.e., every Monday or Wednesday). Applicable schedule updates resulting from the
reviews are made to the TMT Team's Metric Program, and the client and all team members are
then notified of the updates. Schedule reviews look at least one incremental release ahead to
enable early risk mitigation.
5.3.2.4. Progress Variance Monitoring
Actual progress can differ from planned progress for many reasons, including but not limited
to inaccurate effort estimation, resource limitations, and plan or requirement changes. The
TMT technical leads are responsible for notifying the TMT PM, and the PM is responsible for
updating the applicable scheduled tasks in the TMT Team's Metric Program. Deviations that are
beyond the PM's ability to resolve (e.g., members dropping out, non-productive or poor
performance, etc.) are brought to the Project Advisor's attention.
5.3.2.5. Progress Variance Resolution
The TMT PM has the authority to reallocate resources, delegate and reschedule tasks, or
correct performance problems that do not impact a major release. Otherwise, consultation with
the TMT Project Advisor is recommended.
5.3.2.6. Follow-Up on Corrective Action
The TMT Team's Metric Program is used to identify the initial schedule deviation and to analyze
the corrective action results. Corrective action items are closely monitored to ensure that they
effectively recover the schedule variance before a milestone or the master build schedule is
jeopardized.
5.3.3. Budget Control Plan
The total cost of the project is $140,000.00, spread over 14 weeks as shown in Table 1.1-3:
Project Budget Summary. The project cost is calculated at $100.00 per person-hour
expended for each task by each assigned resource (i.e., $100.00 × effort hours × resources).
Periodic cost analyses and reviews are conducted to assess performance and provide insight into
future task assignments.
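For example (task figures assumed for illustration), a task estimated at 8 effort hours with 2
assigned resources would be budgeted at $100.00 × 8 × 2 = $1,600.00.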
5.3.4. Quality Control Plan
There is no independent Quality Assurance (QA) group under the TMT management. Instead,
the TMT PM and Testing Group are responsible for ensuring that all quality and production
control requirements are accomplished. The TMT Test Lead is responsible for conducting
various reviews of project plans, requirements, designs, code, tests, and all other documents. In
performing these duties, the TMT Test Lead monitors adherence to all applicable policies,
processes, procedures, and plans.
The TMT PM generates and provides weekly project status to the TMT Project Advisor on
Mondays at 7:45 PM. There are 12 weekly status reports, and the details of each status report are
noted in the TMT Team's Metric Program under Project Management Tasks. All team members
are required to be present at all weekly status meetings.
5.3.5.2. Internal Reviews
The TMT Team holds in-progress reviews at least once a month for the purpose of monitoring and
adjusting the internal process toward meeting the scheduled milestones, reviewing problems
encountered, presenting proposed resolutions, presenting near-term plans, reviewing risks and
mitigation plans, and assessing individual performance using earned value data extracted from
the TMT Team's Metric Program. The TMT PM is also responsible for providing internal reviews
to each team member based on individual in-progress performance and task assignments.
5.3.5.3. Meetings
As the project progresses toward deliverable deadlines, the TMT Team is likely to call at least
one meeting every Wednesday, as needed, for the purpose of monitoring progress, coordinating
work, and resolving issues.
5.3.5.4. Information Repository
All project information must be submitted to the TMT PM for review and sign-off. Source
code must be uploaded to the test environment at http://www.winhost.com for sharing
among the team members, as well as posted on the TMT Team's website at
http://cse.spsu.edu/ftsui/SWE7903.html.

*** NOTE: As an alternative to uploading files to http://www.winhost.com, team members can also
upload files to the Google Group; however, this is not recommended, so that one central
information repository is maintained.
5.3.5.5. Communication
In addition to the aforementioned means of reporting and project communication, TMT
members are encouraged to communicate through email and phone as needed. The TMT team
members' contact information is listed below:
Table 5.3-1: TMT Team Member Contact Information

Name | Email | Phone | Comment
Adam Scott | ascott4@spsu.edu | 770-940-0988 | Busy on Wednesdays.
Harish Sreenivas | hsreenivas@spsu.edu | 678-896-0727 | Works at SPSU Mon-Wed, 11:00-5:00.
Jagati Vempaty | jvempaty@spsu.edu | 404-751-7896 | N/A.
Seth Zaah | sethzaah@yahoo.co.uk | 678-485-8014 | Works at Client Support/Real Estate, 9:00 AM - 7:00 PM, Mon-Sat.
Tom Le | tomle75@comcast.net | 404-791-3816 | Works at LM Aero Mon-Fri, 9:00-3:00.
Table 5.3-2: TMT Metrics

Category | Metric Name | Purpose
Product Metrics | Cost per Unit | To project future cost estimation.
Product Metrics | MTBF | To collect the mean time between failures to project product availability.
Product Metrics | Product Size | To provide future effort estimation.
Product Metrics | Performance | To provide product performance assessment, since the product is an online service.
Product Metrics | Quality Level | To assess the product quality level.
Process Metrics | Phase-based Effort | To assess the effort by development phase (i.e., requirements, design, code, test, and documentation).
Process Metrics | EDR per Phase | To assess the effectiveness of defect removal per phase.
Process Metrics | | To assess responses to fixes, future effort estimations, and early risk mitigations.
Process Metrics | | To assess future workloads and schedule risks.
Process Metrics | | To assess rework.
Process Metrics | | To assess defects per product size.
Process Metrics | | To identify root causes for schedule slippage and work under-estimates.
Project Metrics | Cost | To identify cost variance and task allocation.
Project Metrics | Productivity | To assess productivity and task reallocation.
Project Metrics | Scope | To identify changes to project scope.
***NOTE: The detailed GQM metric program is defined in a separate MS Excel file and posted
on the TMT website at http://www.cse.spsu.edu/ftsui/SWE7903.html.
1. Identify risks a week ahead of the current build schedule (i.e., Table 1.1-2: Master Build
Schedule).
2. Enter the risks in the TMT Team's Metric Program.
3. The PM and technical leads analyze the risks, determine resolutions, and implement them.
4. The PM and technical leads follow up on the risk resolution results.
Requirements.
Designs.
Test Results.
Code/Product Demo.
Development Status.
Figure 6.1-1: TMT Software Development Life Cycle (SDLC) (a Planning activity followed by the
Initial Development and Increments 1 through 12; each increment spans the Requirements, Design,
Code, Test, Install/Support, and Status Report activities, each producing its documents and
reviews, e.g., SRS, SDD, SUM, SDSR)
The development of the TMT tool is broken into one initial development and 12 incremental
developments. Each incremental development spans all activities from requirements through
design, code, test, install/support, and status reporting; the initial development additionally begins
with the planning activity. The purpose of the initial development is to produce the following
software artifacts:
1.
2.
3.
4.
5.
6.
7.
In all other incremental developments, steps 2 through 7 are reiterated and the corresponding
documents are updated.
Table 6.2-1: TMT Programming Languages

Language | Version | Description
C# | 4.0 | Used to implement business rules and data access logic.
ASP.NET | 4.0 | Used to implement web-based user interfaces.
T-SQL | 2008 | Used to implement database stored procedures, functions, and database queries.
HTML | 4.0 | Used in parallel with ASP.NET to implement web pages.
CSS | 2.0 | Used to control the user interface look and feel.
JavaScript | 1.1 | Used to implement client-side code.
jQuery | 2.0 | Used to improve the user interface.
6.2.5. Tools
The TMT Team uses the following tools for the development and testing of the overall STS
framework and the TMT tool:

Tool | Version | Description
MS Visual Studio | 2010 Pro | An Integrated Development Environment (IDE) used for development and software builds.
MS SQL Server | 2008 Pro | An RDBMS used to develop and test the database scripts, stored procedures, and user-defined functions.
Microsoft Office | 2007 Pro | Word, Excel, Visio, PowerPoint, and Outlook, used for documents, metrics, designs, presentations, and communication, respectively.
MS IE Browser | 7.0 | Used to test the system.
Mozilla Firefox | 8.0 | Used to test the system.
Project Manager | 2010 | An online tool used to manage the work activities.
Format Standard | Estimated Page Count | Review Type
 | 50 | Peer Review
 | 50 | Peer Review
 | 50 | Walkthrough
 | 50 | Walkthrough
 | 50 | Peer Review
 | 50 | Peer Review
 | 50 | Peer Review
 | 50 | Peer Review
 | 50 | Peer Review
In a Peer Review, when the author completes a work product, he/she notifies the project manager
to initiate the review. The project manager ensures that the work product is under configuration
control, properly versioned, and ready for review, and then initiates the review by assigning the
various reviewing activities according to the reviewers' technical expertise. Each reviewer reviews
the assigned part of the work product to find defects and records them. Upon completing the
assigned review, each reviewer submits the findings/defects to the author of the work product. The
author updates the work product based on the submitted findings, as applicable. The project
manager moderates the entire review process to ensure all findings are properly addressed, and
then closes out the review.
Figure 7.4-1: Peer Review Process (the author completes the work product and notifies the project
manager; the project manager initiates, moderates, and closes the review; the reviewers find
defects; the author updates the work product)
The process of a Walkthrough Review is similar to the Peer Review, except that all participants
(i.e., the project manager, author, and reviewers) gather in a meeting and walk through the
work product to find defects and correct them as applicable. This type of review is more
expensive, but it yields higher product quality.
Appendix A is the checklist for requirements review.
Appendix B is the checklist for design review.
Appendix C is the checklist for code review.
Appendix D is the checklist for test procedure/script review.
Appendix E is the template for how to record a defect.
Appendix F is the template for how to plan and kick off a review.
Appendix G is the template for how to close out a review.
Appendix H is the template for how to resolve a finding.
Product Type | Work Product | Type of Review | Purpose
Documentation | SPMP | Peer Review | To agree on and follow the plans and established processes.
Requirements | SRS | Walkthrough | See Appendix A.
Test | STP | Peer Review | See Appendix D.
Design | SDD | Walkthrough | See Appendix B.
Code | Source Code | Peer Review | See Appendix C.
Test | STD | Peer Review | See Appendix D.
Test | STR | Peer Review | See Appendix D.
Documentation | SVD | Peer Review | To assure that all source code produced is properly listed.
Documentation | SDSR | Peer Review | To assure that the report properly reflects the development statuses.
Documentation | SUM | Peer Review | To assure that the manual clearly explains how to use the product.
8. ADDITIONAL PLANS
There are no additional plans.
APPENDIXES
Appendix A: Requirements Review Checklist

Author:
Moderator:
Date:
Purpose:

Review specification quality:
Can each item be implemented and tested?
Is the traceability backward and forward correct?
Has the traceability been captured in the appropriate tool?

Review for areas of future growth:
Have all areas of uncertainty, incompleteness, or areas of future growth been considered and
identified?

Review error handling:
Appendix B: Design Review Checklist

Review for anticipated changes:
Have all reasonable anticipated changes been identified?
Are areas of anticipated changes isolated?

Review design considerations:
Is the design complete?
Have all design views been addressed?
Does it perform the specified function?
Does it promote information hiding and reuse?
Is it highly cohesive/loosely coupled?
Does the design trace to the requirements?
Has this traceability been captured in the design according to standards?
Defect Description | Level
No defect found or comment. | None
Prevents the accomplishment of an operational or mission-essential capability; jeopardizes safety; or causes significant technical, cost, or schedule risks to the project or to life-cycle support of the system. This is a MAJOR defect. | Major
Adversely affects the accomplishment of an operational or mission-essential capability and no work-around solution is known, or adversely affects technical, cost, or schedule risks to the project or to life-cycle support of the system and no work-around solution is known. This is a MAJOR defect. | Major
Adversely affects the accomplishment of an operational or mission-essential capability but a work-around solution is known, or adversely affects technical, cost, or schedule risks to the project or to life-cycle support of the system but a work-around solution is known. This is a MAJOR defect. | Major
Results in user/operator inconvenience or annoyance but does not affect a required operational or mission-essential capability, or results in inconvenience or annoyance for development or support personnel but does not prevent the accomplishment of those responsibilities. This is a MINOR defect. | Minor
Any other effect not described above. Projects may define and assign sub-codes for this priority. This is a MINOR defect. | Minor
Code | Defect Type | Description
1a | Functionality | The definition of a functional capability desired by the customer was not clear and unambiguous, or was misinterpreted.
1b | GFE/BFE/CFE/Subcontractor | Incompatibilities or deficiencies with specification-compliant Government Furnished Equipment (GFE), Buyer Furnished Equipment (BFE), Contractor Furnished Equipment (CFE), or subcontractor-supplied products in their interaction with the system.
1c | Software Engineering Environment (SEE) | The SEE, as specified, is deficient in supporting the software engineering process; incompatibilities exist between SEE tools.
1d | Programmatic Requirements | Conflicting contractual requirements; problems relating to security requirements; problems relating to teaming with other companies.
1e | Test Equipment/Environment | The test equipment/environment, as specified, is deficient to support system operation.
1f | Support Equipment/Environment | The support equipment/environment, as specified, is deficient to support system operation.
1z | Documentation | Documentation does not accurately/adequately describe the planning, and categories 1a through 1f are not appropriate.
2a | Subsystem/Software Requirements | Incorrect/incomplete translation or specification of the lower-level (subsystem/software) requirements that was not the result of category 2e. Examples include incorrect specification at the SRS level, incorrect subsystem specification, etc.
2b | Physical Interface | An error in the definition or specification of a physical (hardware) interface.
2c | Functional Interface | Incorrect/incomplete specification of the functional interaction or communication of data with other processes or subsystems, and categories 2b and 2d are not appropriate.
2d | User Interface | Incorrect/incomplete specification of the software interaction with the user.
2e | System Requirements | Incorrect/incomplete translation or specification of the customer requirements that was not the result of category 1a. Examples include incorrect specification at the SSS level, Air Vehicle Specification, etc.
2z | Documentation | Documentation does not accurately describe the intended requirements, and categories 2a through 2e are not appropriate. Format, documentation standards, typographical errors, and understandability are included in this category.
3a | Physical Interface | Incorrect/incomplete design of the software interaction with hardware.
3b | Subsystem Interface | Incorrect/incomplete design of the functional interaction or communication of data with other subsystems or processes, and categories 3a and 3c are not appropriate.
3c | User Interface | Incorrect/incomplete design of the software interaction with the user.
3d | Inter-process Communication | Incorrect/incomplete interfaces or communications between processes or components within a product. Includes timing/scheduling anomalies as well as parameter passing/omission and other inter-process defects.
3e | Data Definition | Incorrect/incomplete design of the data structures to be used in the product, e.g., defects associated with data precision/accuracy and data units.
3f | Unit/Procedure Design | Problem with the control flow and execution within a unit/procedure. For example, errors in logic/computations in algorithms, incorrect/incomplete translation of requirements, incorrect/incomplete sequence of operations in a test procedure or installation procedure, etc.
3g | Error Handling | The checks for error detection/isolation were not adequate, or the response to detected error conditions was not correct.
3h | Standards | Violations of design standards that do not cause a defect in the functional operation of the product. This does not include documentation standards.
3i | Traceability | A known requirement was missed or a non-existing requirement was introduced into the design.
3z | Documentation | Documentation does not accurately describe the design or software configuration item (product) test plans/procedures, and categories 3a through 3i are not appropriate. Format, documentation standards, typographical errors, and understandability are included in this category.
4a | Logic | Incorrect/incomplete decision logic causes the product not to implement the full intent of the design, e.g., invalid "If-Then-Else," loop structures, case statements, missing "Begin-End," wrong execution sequence.
4b | Computations | Incorrect/incomplete mathematical or arithmetic operations cause the product not to implement the full intent of the design.
4c | Data Handling | Incorrect/incomplete definition or use of data structures in accordance with the design. This category includes data precision, improper indexing into arrays or tables, improper scaling or units of measure, and data incorrectly stored or accessed.
4d | Unit Interface | Problems related to the calling, parameter definition, and termination of subprocesses. This category includes incorrect order or number of parameters passed, ambiguous termination value for a function, incorrect parameter data types, inter-process communication problems, and timing/scheduling anomalies.
4e | Standards | Violations of programming standards, incorrect use of the programming language, typographical errors during code/text entry for source code and test procedures, etc.
4f | Build/Configuration | Incorrect/incomplete software build/configuration, e.g., missing code, wrong version of code, incorrect placement in memory, missing test procedure, etc.
4z | Documentation | Documentation does not accurately describe the implementation of the design, and categories 4a through 4f are not appropriate. This category also includes defects in documentation format or other documentation standards, typographical errors, understandability, etc.
5z | Documentation | Documentation does not accurately reflect test results. Format, documentation standards, typographical errors, and understandability are included in this category. This category does not include problems with software configuration item test plans, test cases, or test procedures.
6z | Documentation | Documentation generated to support the transition to the customer is in error. Format, documentation standards, typographical errors, and understandability are included in this category. This category does not include problems with documentation developed in prior phases that are covered by categories 1z, 2z, ... 5z.
7a | GFE/BFE | A problem caused by GFE/BFE not operating in accordance with its specification.
7b | CFE/Subcontractor | A problem caused by external CFE/subcontracted equipment/software not operating in accordance with its specification, e.g., host computer hardware, firmware, supplier errors, or performance problems. This does not include internally maintained reused assets.
7c | SEE | A problem caused by the SEE not operating in accordance with its specification, e.g., compiler problems, linker problems, interface definition tool problems, etc. This does not include internally maintained reused assets.
7d | Other | Other external problem, and no other external category is appropriate. This does not include internally maintained reused assets.
7e | Test Equipment/Environment | The product under test is operating correctly, yet the test equipment/environment is operating incorrectly. This does not include internally maintained reused assets.
7f | Support Equipment/Environment | The product under test is operating correctly, yet the support equipment/environment is operating incorrectly. This does not include internally maintained reused assets.
 | Comment | This Finding is only a comment and does not document a defect or a candidate for a defect.