
Testing

Business Unit (BU): Applications


Date Created: 12/12/2003
Date Last Updated: 15th July 2011
Version: 4.3
Document Number: PM052

Dell Services - Applications


Any suggestions for improving this document may be mailed to the SEPG at DL-QOE-SEPG@Dell.com

Security Classification: Dell Proprietary and Confidential. © 2010 Dell Inc. All Rights Reserved.

Document Information

Document Control

Version | Date Changed | Description of Changes | Completed By
4.0 | 31st May 2010 | References to QR014 have been replaced with TP512, as this is now a template. | SEPG
4.1 | 30th Nov 2010 | Product integration process area gaps identified by SEPG addressed as a part of Steps 2 and 3. | G Siva
4.2 | 21st April 2011 | Removed references to TP003 and TP015. | SEPG
4.3 | 15th July 2011 | QR044 is no longer in use. All references thereto have been removed. | SEPG

Contents

Tables
Chapter 1 Introduction
  1 Overview
    1.1 Objective
    1.2 Scope
    1.3 Intended Audience
    1.4 Acronyms
Chapter 2 Procedure
  2 Description
    2.1 Process Schematic
    2.2 Entry Criteria
    2.3 Inputs
    2.4 Tasks
    2.5 Outputs
    2.6 Exit Criteria
Chapter 3 Test Requirement Analysis Phase
  3 Description
    3.1 Entry Criteria
    3.2 Inputs
    3.3 Tasks
    3.4 Outputs
    3.5 Exit Criteria
Chapter 4 Test Planning Phase
  4 Description
    4.1 Entry Criteria
    4.2 Inputs
    4.3 Tasks
    4.4 Outputs
    4.5 Exit Criteria
Chapter 5 Test Design Phase
  5 Description
    5.1 Entry Criteria
    5.2 Inputs
    5.3 Tasks
    5.4 Outputs
    5.5 Exit Criteria
Chapter 6 Test Execution Phase
  6 Description
    6.1 Test Execution - Functional/ System Testing
      6.1.1 Entry Criteria
      6.1.2 Inputs
      6.1.3 Tasks
      6.1.4 Outputs
      6.1.5 Exit Criteria
    6.2 Test Execution - Non Functional Testing
      6.2.1 Entry Criteria
      6.2.2 Inputs
      6.2.3 Tasks
      6.2.4 Outputs
      6.2.5 Exit Criteria
Chapter 7 Test Maintenance Phase
  7 Description
    7.1 Entry Criteria
    7.2 Inputs
    7.3 Tasks
    7.4 Outputs
    7.5 Exit Criteria
Chapter 8 Metrics
  8 Metrics
Chapter 9 Tailoring /Waivers
  9 Tailoring /Waivers Guidelines
Appendix A Responsibilities
Appendix B Related Documents
Appendix C Severity Classifications
Glossary

Tables
Table 1 Acronyms
Table 2 Tasks

Chapter 1 Introduction

1 Overview

1.1 Objective
This artifact provides a systematic approach to analyzing a software item with the aim of detecting the differences between existing and required conditions (defects) at each phase of the Testing Lifecycle of a project.

The six phases of the Testing Lifecycle, described as sub-processes of the main Testing procedure in this document, are:
Test Initiation Phase
Test Requirement Analysis Phase
Test Planning Phase
Test Design Phase
Test Execution Phase
Test Maintenance Phase
1.2 Scope
To provide a detailed approach for conducting manual as well as automated testing at each phase of the Testing Lifecycle.
1.3 Intended Audience
All Dell Services Applications Services Associates
1.4 Acronyms
Table 1 Acronyms

Acronym Description
CC Configuration Controller
DTS Defect Tracking System
PL Project Leader
PM Project Manager
PPD Project Planning Document
QMS Quality Management System
QOE Quality and Operational Excellence
SME Subject Matter Expert
SRS Software Requirements Specifications
TM Team Member
TRIPS Time and Review Information Processing System


Chapter 2 Procedure

2 Description

In this document, the procedure for Testing is structured into six sub-processes based on the six phases of the Testing Lifecycle, namely:
Test Initiation Phase
Test Requirement Analysis Phase
Test Planning Phase
Test Design Phase
Test Execution Phase
Test Maintenance Phase

The figure below depicts the phases of the Testing Lifecycle for an application.
2.1 Process Schematic
[Figure 1 Testing Lifecycle - process schematic. The figure maps the Software Development Lifecycle (SDLC) stages (Initiation, Analysis, Construction, Transition, Deliver) to the testing phases based on the lifecycle model:
Test Initiation: knowledge acquisition/transfer, analysis of artifacts, identification and resolution of issues/clarifications
Test Requirement Analysis: requirements analysis and traceability, scope and methodology, risk-based test strategy, test objectives and goals, high-level test scenario analysis, test type identification
Test Planning: test methodologies, test design technique evaluation and analysis, test environment set-up, finalize Test Plan, create test documents
Test Design: preparation of test scenarios/cases, test data preparation, automation framework development, automation test script generation, test development
Test Execution: manual test execution, automated test execution, defect management, metrics analysis, reporting, testware management, test script maintenance
Test Maintenance: handling change requests, test script maintenance
Planning and control activities (defect reports, progress status and reviews, risks and recommendations, feedback to the project group) run across all phases.]

Figure 1 Testing Lifecycle


2.2 Entry Criteria


Project is approved and Task Order is available
Key functional & technical specifications/ requirements and other related artifacts are
finalized and received from the Customer
Product/ application is accessible (in case the construction is already completed)
2.3 Inputs
High level Project Plan (schedule)
Requirement, Design or Functional specification documents from the Customer
2.4 Tasks
Table 2 Tasks

Step 1: Prepare a high-level resource plan with the list of resources and arrange for allocations.
Responsibility: PM / Test Manager
Artifacts/References: P3MM03-015_PPD_AS - Project Planning Document_AS; TP010-AppTstM - Test Plan

Step 2: Identify the channel for communication and plan, which may include the following:
- Onsite-offshore coordination
- Network connectivity and remote access to the test environment and designated servers
- Voice, video and text messaging channels
Responsibility: PM / Test Manager
Artifacts/References: P3MM03-015_PPD_AS - Project Planning Document_AS; TP010-AppTstM - Test Plan

Step 3: Plan and execute knowledge transfer activities, including:
- Review of application-related artifacts
- Application walkthrough by the Business SME
Responsibility: PM / Test Manager
Artifacts/References: P3MM03-015_PPD_AS - Project Planning Document_AS; TP010-AppTstM - Test Plan

Step 4: Plan for transition activities both at onsite and offshore.
Responsibility: PM / Test Manager
Artifacts/References: P3MM03-015_PPD_AS - Project Planning Document_AS; TP010-AppTstM - Test Plan

Step 5: Identify and arrange for the training needs required for the project. These may include:
- Training on tools
- Any other project-specific training
Responsibility: PM / Test Manager
Artifacts/References: P3MM03-015_PPD_AS - Project Planning Document_AS; TP010-AppTstM - Test Plan

Step 6: Update P3MM03-015_PPD_AS - Project Planning Document_AS based on the inputs from the above-mentioned steps.
Responsibility: PM / Test Manager
Artifacts/References: P3MM03-015_PPD_AS - Project Planning Document_AS; TP010-AppTstM - Test Plan

Step 7: Review and analyze the requirement, design and functional specification documents received from the Customer and identify issues or items that need clarification.
Responsibility: PM / Test Lead / Test Manager
Artifacts/References: P3MM03-220 Review Comment Log - AS Review Comment Log

Step 8: Resolve and close all the issues identified.
Responsibility: PM / Test Lead / Test Manager
Artifacts/References: P3MM03-220 Review Comment Log - AS Review Comment Log; TP512- Technical Problem/ Issue/ Query Log

2.5 Outputs
P3MM03-015_PPD_AS. - Project Planning Document_AS
Baselined Software Requirement Specification (SRS) document

2.6 Exit Criteria


Baselined Project Management Plan including the details of the following:
Validated and Documented Knowledge Transfer
Detailed Transition Plan
Identified Training requirements
Baselined Requirement and Functional specifications


Chapter 3 Test Requirement Analysis Phase

3 Description

The objective of this phase is to review the baselined specification documents to identify the test
coverage, scope and risks involved in testing. It also involves performing testability analysis for
various high-level test scenarios identified.
3.1 Entry Criteria
Baselined Requirement and Functional specifications
Baselined Technical specifications
3.2 Inputs
P3MM03-015_PPD_AS. - Project Planning Document_AS
Baselined Software Requirement Specification (SRS) document
3.3 Tasks
Table 3 Tasks

Step 1: Analyze the baselined specification documents for:
- Identifying the objectives, scope and coverage for testing
- Identifying the functions of the application that are to be tested and those that are not to be tested
- Verifying whether the functions are testable
Responsibility: PM / Test Lead / Test Manager
Artifacts/References: TP010-AppTstM - Test Plan

Step 2: Identify the type of testing: Black Box Testing, White Box Testing or Structural Testing.
Identify the levels of testing, for example Unit Testing, Integration Testing, System Testing and Acceptance Testing, as per the desired functions. For each level of testing, identify the Test Readiness Criteria, taking into account interface compatibility as well (e.g., the software has been successfully walked through and unit tested before integration testing begins).
Identify the testing approach: Manual Testing or Automation Testing.
Responsibility: PM / Test Lead / Test Manager
Artifacts/References: P3MM03-015_PPD_AS - Project Planning Document_AS; TP010-AppTstM - Test Plan; Project Specification documents

Step 3: Identify the testing hardware and software requirements; identify and maintain testing environments in terms of support, resources, tools to be used, etc.
Responsibility: PM / Test Lead / Test Manager
Artifacts/References: P3MM03-015_PPD_AS - Project Planning Document_AS; TP010-AppTstM - Test Plan; Project Specification documents

Step 4: Identify risks in terms of business, functional and technical risks.
Responsibility: PM / Test Lead / Test Manager
Artifacts/References: P3MM03-015_PPD_AS - Project Planning Document_AS; Project Specification documents; P3MM10-020_RAIDOMatrix_AS - RAIDOMatrix_AS

Step 5: Identify high-level scenarios for testing. Prepare the Test Traceability Matrix identifying the test coverage and its mapping to the requirements (an illustrative sketch follows this table).
Responsibility: PM / Test Lead / Test Manager
Artifacts/References: P3MM03-015_PPD_AS - Project Planning Document_AS; Project Specification documents; TP007- Traceability Matrix
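As an illustration of the traceability mapping prepared in Step 5, the following minimal Python sketch builds a requirement-to-test-case matrix and reports uncovered requirements. The requirement and test case identifiers are hypothetical placeholders, not taken from the TP007 template.

```python
# Minimal sketch of a requirements-to-test-cases traceability matrix.
# Requirement and test case IDs below are illustrative placeholders.

from collections import defaultdict

# Each test case declares the requirement(s) it covers (forward traceability).
test_cases = {
    "TC-001": ["REQ-01"],
    "TC-002": ["REQ-01", "REQ-02"],
    "TC-003": ["REQ-03"],
}

requirements = ["REQ-01", "REQ-02", "REQ-03", "REQ-04"]

# Invert the mapping to get backward traceability: requirement -> test cases.
coverage = defaultdict(list)
for tc_id, reqs in test_cases.items():
    for req in reqs:
        coverage[req].append(tc_id)

# Report coverage and flag requirements with no mapped test case.
for req in requirements:
    mapped = coverage.get(req, [])
    print(f"{req}: {', '.join(mapped) if mapped else 'NOT COVERED'}")
```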

3.4 Outputs
High-level Test Scenarios
P3MM10-020_RAIDOMatrix_AS - RAIDOMatrix_AS
TP007- Traceability Matrix

3.5 Exit Criteria


Identified test objectives, scope and coverage
Documented risk analysis and strategy
Identified high-level scenarios for testing
Identified hardware and software testing requirements


Chapter 4 Test Planning Phase

4 Description

The objective of this phase is to estimate and plan for testing based on the testing objectives,
testability and other analysis conducted at the Requirement Analysis Phase. This also includes
setting up of the test environment.
4.1 Entry Criteria
Identified test objectives, scope and coverage
Documented risk analysis and strategy
Identified high-level scenarios for testing
Identified hardware and software testing requirements
4.2 Inputs
High-level Test Scenarios
P3MM10-020_RAIDOMatrix_AS - RAIDOMatrix_AS

4.3 Tasks
Table 4 Tasks

Step 1: Prepare the test estimation. Test estimation may be conducted using the testing estimation models available in the QMS:
- Function Point Based Estimation Model for Testing (TP032)
- Requirement Based Estimation Model for Lifecycle Testing (TP032A)
Get the test estimation reviewed by the peer reviewers / peer Test Managers (an illustrative sketch follows this table).
Responsibility: PM / Test Manager
Artifacts/References: Project related documents; PM001- Estimation; SG160- Estimation Guidelines for Testing

Step 2: Prepare the Test Plan for the project by incorporating the details given in the Guidelines for Testing (SG090). The Test Plan should be based on the project's P3MM03-015_PPD_AS - Project Planning Document_AS and should include:
- Testing criteria, developed and reviewed in consultation with the Customer and end users, where appropriate
- Testing methods, to be effectively used to test the software
- Testing Strategy based on the analysis conducted at the Requirement Analysis Phase
- Acceptance criteria, duly reviewed and approved by the Customer, stating the details of the deliverables, quality & testing objectives and testing support & resource requirements
- Level of testing to be performed
- Sequence for integration of components (for example: build document/script)
- Test coverage approach, e.g. statement coverage, path coverage, branch coverage
- Test deliverables, e.g. Test Plan, Test Cases, Test Reports
- Resources required for testing, like testing environment, test team, testing tools, etc.
- Test preparation tasks, which may include preparation of testing documentation, scheduling test resources, developing test drivers and developing simulators
- Plan for Defect Management, Risk Management, Configuration Management, etc.
- Schedule for testing activities
- Responsibilities for testing activities
Responsibility: PM / Test Lead / Test Manager
Artifacts/References: Project related documents; TP010-AppTstM - Test Plan; SG090 Guidelines for Test Documentation; Test Status report

Step 3: Get the Test Plan reviewed by the peer Reviewer / peer Test Manager / Test Lead. Incorporate review comments, if any, and get the sign-off.
Responsibility: PM / Test Lead / Test Manager
Artifacts/References: TP010-AppTstM - Test Plan

Step 4: Send the duly reviewed Test Plan to the Customer and get the approval and sign-off.
Responsibility: PM / Test Lead / Test Manager
Artifacts/References: TP010-AppTstM - Test Plan

Step 5: Bring the Customer-reviewed and approved Test Plan under configuration management as per the procedure for Configuration Management (PM321).
Responsibility: Test Manager / Test Lead / CC
Artifacts/References: Configuration Management Tool

Step 6: Update the Test Plan along with the traceability matrix tool whenever there is a change in the software requirements, software design, code or testing environment.
Note: The PM is to ensure that the Test Plan revision also takes into account any regression testing that may be required, as appropriate, at each test level. The revised Test Plan is reviewed, approved, configured and filed. The Traceability Matrix Tool should be used for both forward and backward traceability.
Responsibility: PM / Test Lead / Test Manager
Artifacts/References: Traceability Tool

Step 7: Prepare for resource ramp-up and get the allocations done based on the testing requirements identified. The resources are requested in the Resource Request Form (QR012) or a Service Order in PeopleSoft is raised. The System Resource Requirement Form (QR513) is raised. When testing is to be carried out at the Customer's facilities, the Customer should be informed sufficiently in advance and their agreement obtained.
Responsibility: PM / Test Manager
Artifacts/References: TP010-AppTstM - Test Plan

Step 8: Set up the test environment, which includes:
- Test bed installation and configuration
- Network connectivity
- Installation and configuration of all software/tools
- Coordination with vendors and others
Responsibility: PM / Test Lead / Test Manager
Artifacts/References: P3MM03-015_PPD_AS - Project Planning Document_AS; Project Specification documents
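The QMS estimation models referenced in Step 1 (TP032, TP032A) are internal templates and are not reproduced here. As a generic illustration only, the sketch below computes a requirement-based effort figure from assumed complexity weights and overhead percentages; all counts, weights and rates are hypothetical assumptions, not values from TP032/TP032A.

```python
# Generic requirement-based test effort sketch (illustrative only; not the
# TP032/TP032A models). All weights, counts and overheads are assumptions.

# Assumed hours of test design + execution effort per requirement, by complexity.
EFFORT_PER_REQUIREMENT = {"simple": 2.0, "medium": 4.0, "complex": 8.0}

# Assumed requirement counts for a sample project.
requirement_counts = {"simple": 20, "medium": 12, "complex": 5}

# Overheads expressed as fractions of the base effort (assumed values).
REVIEW_OVERHEAD = 0.15       # peer review of test artifacts
MANAGEMENT_OVERHEAD = 0.10   # planning, reporting, coordination

base_effort = sum(
    EFFORT_PER_REQUIREMENT[c] * n for c, n in requirement_counts.items()
)
total_effort = base_effort * (1 + REVIEW_OVERHEAD + MANAGEMENT_OVERHEAD)

print(f"Base test effort:     {base_effort:.1f} person-hours")
print(f"Total with overheads: {total_effort:.1f} person-hours")
```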


4.4 Outputs
TP010-AppTstM - Test Plan with defined Test Strategy and Acceptance Criteria
Environment set up
Tools required are set up

4.5 Exit Criteria


Baselined Test Plan
Available Test environment
Test Resources arranged


Chapter 5 Test Design Phase

5 Description

The objective of this phase is to prepare for Test Execution by designing, developing, validating and baselining Test Scenarios, Test Cases, Test Scripts and Test Data.
5.1 Entry Criteria
Identified Test objectives, scope and coverage
Baselined Test Plan
Available Test environment
5.2 Inputs
P3MM03-015_PPD_AS. - Project Planning Document_AS
TP010-AppTstM - Test Plan
Test Case Design Techniques (SG090- Guideline for Testing)
5.3 Tasks
Table 5 Tasks

Step 1: Identify the various Test Scenarios.
Responsibility: Test Lead / Test Manager
Artifacts/References: SRS and other technical documents

Step 2: Review all the identified Test Scenarios.
Responsibility: TM / Test Lead / Test Manager, Dev. Lead, BAs, Architects, Customer
Artifacts/References: SRS and other technical documents; SG090- Guidelines for Testing

Step 3: Get the sign-off from the Customer on the Test Scenarios. In case the Test Scenarios have been received from the Customer itself, Steps 1, 2 and 3 can be skipped.
Responsibility: Test Lead / Test Manager / PM
Artifacts/References: Reviewed Test Scenarios

Step 4: Design and develop Test Cases based on the test scenarios, guidelines and technical specifications available. It is essential to maintain traceability from the SRS, design and code to the test cases using a tool. Ensure that all the Customer requirements have been catered to in the test cases. Also ensure the following:
- Unit test cases are prepared based on the program specifications
- Integration test cases are prepared against the designated version of the SRS document and the software design document; the integration test cases shall indicate the reference section numbers of the design document
- System and acceptance test cases are prepared against the baselined software and the baselined SRS
Responsibility: Test Lead / TM
Artifacts/References: SRS and other technical documents; SG090- Guidelines for Testing; TP012- Test Case Design

Step 5: Review the test cases using CL006. The responsibility for reviewing the test cases should be as follows:
- Integration Test Cases: Associate responsible for requirement, design and system and acceptance testing
- System & Acceptance Test Cases: Customer / End user
Responsibility: Reviewer
Artifacts/References: TP012- Test Case Design; CL006- Test Case Review Checklist; SG090- Guidelines for Testing

Step 6: Get the sign-off from the Customer on the Test Cases.
Note: The Customer sign-off criteria for Test Case approval are as specified in the Test Plan.
Responsibility: PM / Test Lead / Test Manager
Artifacts/References: Reviewed Test Cases

Step 7: Prepare Test Data.
Responsibility: Test Lead / TM
Artifacts/References: Test Scenarios, Test Cases

Step 8: Identify automation requirements for functional and non-functional testing, including:
- Identification of functionalities that can be automated
- Designing the Test Automation Framework
- Tool evaluation and identification
- Development of Test Scripts
- Script integration, review and approval
- In case of Load and Performance Testing, identification of critical conditions that need to be tested for performance (load, stress, volume under peak and normal conditions), unless specified by the Customer
Responsibility: Test Manager / Test Lead / TM
Artifacts/References: Test Scenarios, Test Cases

Step 9: Baseline the validated, reviewed and approved Test Scenarios, Test Cases and Test Scripts as given under the Configuration Management procedure (PM321).
Responsibility: Test Manager / Test Lead & CC
Artifacts/References: Test Scenarios, Test Cases, Test Scripts

Step 10: Prepare a Traceability Matrix mapping the Test Scenarios, Test Cases and Test Scripts to the requirements.
Responsibility: Test Manager / Test Lead
Artifacts/References: TP007- Traceability Matrix; Test Scenarios, Test Cases, Test Scripts; Specification Documents

Step 11: Prepare for Test Execution based on the Test Plan by performing the following tasks (an illustrative prioritization sketch follows this table):
- Categorize Test Cases / Test Scripts on criticality and complexity
- Prioritize Test Cases / Test Scripts for execution based on the risk factors
- Validate the test environment set-up
- Validate the test data
Refer to SG090- Guidelines for Testing for Test Case categorization, prioritization, test data preparation, test environment, etc.
Responsibility: Test Manager / Test Lead / TM
Artifacts/References: TP010-AppTstM - Test Plan; Test Scenarios, TP012- Test Case Design, Test Scripts; Test Data; SG090- Guidelines for Testing

Step 12: Verify the test readiness criteria based on the following:
- For integration testing, code review and unit testing are successfully completed
- For system testing, integration testing is successfully completed
- For acceptance testing, functional and non-functional testing are successfully completed
Responsibility: TM / Test Engineer
Artifacts/References: TP010-AppTstM - Test Plan; Test Scenarios, Test Cases, Test Scripts; Test Data
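As an illustration of the categorization and prioritization described in Step 11, the sketch below ranks test cases by a simple risk score combining criticality and complexity. The scoring weights and sample test cases are hypothetical assumptions, not values taken from SG090.

```python
# Minimal sketch: prioritize test cases by a risk score derived from
# criticality and complexity. Weights and sample data are assumptions.

CRITICALITY = {"high": 3, "medium": 2, "low": 1}
COMPLEXITY = {"complex": 3, "medium": 2, "simple": 1}

# Hypothetical test cases: (id, criticality, complexity)
test_cases = [
    ("TC-101", "high", "simple"),
    ("TC-102", "medium", "complex"),
    ("TC-103", "low", "simple"),
    ("TC-104", "high", "complex"),
]

def risk_score(criticality: str, complexity: str) -> int:
    # Weight criticality more heavily than complexity (assumed policy).
    return 2 * CRITICALITY[criticality] + COMPLEXITY[complexity]

# Execute the highest-risk cases first.
ordered = sorted(test_cases, key=lambda tc: risk_score(tc[1], tc[2]), reverse=True)
for tc_id, crit, comp in ordered:
    print(f"{tc_id}: risk={risk_score(crit, comp)} (criticality={crit}, complexity={comp})")
```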

5.4 Outputs
Test Scenarios - Functional & Non-Functional
Test Cases - Functional
Test Scripts - Functional & Non-Functional
Test Data
Traceability Matrix

5.5 Exit Criteria


Validated and baselined Test Scenarios/ Test Cases/ Test Scripts - Functional & Non-Functional
Updated Traceability Matrix
Test environment validated and ready as per schedule
Test data validated and available
Code promoted to System / Acceptance Test environment(s)


Chapter 6 Test Execution Phase

6 Description

The objective of this phase is to verify and validate the system to ensure that it performs as per the defined acceptance criteria and that a defect-free delivery is made to the Customer.

The Test Execution phase is further subdivided into Functional Testing and Non-Functional Testing.
6.1 Test Execution- Functional/ System Testing

6.1.1 Entry Criteria

Validated and baselined Test Plan available
Validated and baselined Test Scenarios/ Test Cases
System Test environment validated and ready as per schedule
Test data validated and available
Code promoted to System Test environment (build available)

6.1.2 Inputs

Test Scenarios
Test Cases
Test Data
Environment set up
Test Tools

6.1.3 Tasks

Table 6 Tasks

Step 1: Perform Functional/ System Test execution as per the Test Scenarios and Test Cases identified and defined. Log all the test results and prepare Test Logs under the TP012- Test Case Design template. If for some reason certain test cases are not executed, the reasons are to be stated in the test logs (TP012). For example, if certain test cases cannot be executed before a defect is resolved, a remark is put in TP012; such test cases are monitored and executed after the defect is fixed.
Responsibility: TM / Test Engineer
Artifacts/References: Test Scenarios; TP012- Test Case Design; Test Data; Test Status Report

Step 2: Review the Test Logs prepared as part of the TP012- Test Case Design template and validate the test results.
Responsibility: Test Lead / Test Manager / PM
Artifacts/References: TP012- Test Case Design

Step 3: Log the errors identified as defects into the Defect Tracking Tool (an illustrative defect record sketch follows this table).
Responsibility: Test Lead / Test Engineer / Developer
Artifacts/References: Defect Tracking Tool; Test Scenarios; TP012- Test Case Design; Test Data

Step 4: Validate the defects logged under the Defect Tracking Tool to qualify them as defects. For each defect recorded, classify its severity level from 1 to 5 as described in detail under Appendix C- Severity Classifications. Conduct Defect Root Cause Analysis as per PM357- Causal Analysis and Resolution.
Responsibility: Test Lead / Test Engineer / Developer
Artifacts/References: SRS and other technical documents; Test Scenarios; TP012- Test Case Design; Test Data

Step 5: Validate the system, once the defect is fixed, by conducting Regression and Re-Testing. Close the defect under the Defect Tracking Tool.
Responsibility: TM / Test Engineer
Artifacts/References: Test Scenarios; TP012- Test Case Design; Test Data

Step 6: Report test progress in the form of Test Status Reports (TP028).
Responsibility: PM / Test Manager / Test Lead
Artifacts/References: TP028- Test Status Report; TP011-AppTstM- Project Metrics for Testing

Step 7: Report, capture and analyze Test Metrics.
Responsibility: PM / Test Manager / Test Lead
Artifacts/References: TP011-AppTstM- Project Metrics for Testing

Step 8: Prepare the Final Test Summary Report on completion of the System Testing. This report acts as a reference point and provides a summary of the testing conducted along with details of test strategies, test tools, etc. It also provides mapping of test status and test incidents with the functions/ requirements tested.
Responsibility: PM / Test Manager / Test Lead
Artifacts/References: TP029- Final Test Summary Report
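The procedure assumes a Defect Tracking Tool (DTS) but does not prescribe its interface. As a language-neutral illustration of the data captured when logging and classifying a defect in Steps 3 and 4, the sketch below models a defect record with the 1-5 severity scale from Appendix C. The field names and the in-memory "tracker" are assumptions, not the actual DTS API.

```python
# Illustrative defect record for Steps 3-4; not the actual Defect Tracking
# System (DTS) interface. Field names and the in-memory store are assumptions.

from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class Defect:
    defect_id: str
    summary: str
    test_case_id: str          # failing test case (TP012 reference)
    item_version: str          # item version / build under test
    severity: int              # 1 (System Critical) .. 5 (Enhancement), per Appendix C
    status: str = "Open"       # e.g. Open -> Assigned -> Fixed -> Verified -> Closed
    attachments: List[str] = field(default_factory=list)  # logs / screenshots
    raised_on: date = field(default_factory=date.today)

    def __post_init__(self):
        if not 1 <= self.severity <= 5:
            raise ValueError("Severity must be classified from 1 to 5")

# Simple in-memory list standing in for the DTS.
tracker: List[Defect] = []
tracker.append(Defect("DEF-001", "Login fails for valid user", "TC-104", "Build 1.2.0", severity=2))
print(tracker[0])
```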

6.1.4 Outputs

Functional test results and Status reports


Test Status Report
TP029- Final Test Summary Report
TP012- Test Case Design document with updated execution logs


Metric Reports

6.1.5 Exit Criteria

Test Acceptance Criteria has been met.


Final Test Summary Report is available

6.2 Test Execution- Non Functional Testing

6.2.1 Entry Criteria

Validated and baselined Test Plan available
Validated and baselined Test Cases / Test Scripts
Load and Performance test environment validated and ready for execution
Validated System Integration Test results (validated results are a criterion for load and performance testing)

6.2.2 Inputs

Non Functional Test Cases/ Test Scripts


Test Data
Non Functional Environment set up
Test Tools

6.2.3 Tasks

Table 7 Tasks

Step 1: Execute the Non-Functional Test Cases and Test Scripts. Log all the test results and prepare Test Logs under the TP012- Test Case Design template.
Responsibility: TM / Test Engineer
Artifacts/References: Test Scenarios; TP012- Test Case Design; Test Scripts; Test Data

Step 2: Review the Test Logs as part of TP012 and validate the test results.
Responsibility: PM / Test Manager / Test Lead
Artifacts/References: TP012- Test Case Design

Step 3: Detect defects/ issues based on any of the factors: Load / Stress / Volume / Localization / Reliability / Compatibility. Log the defects reported in the Defect Tracking Tool. For each defect recorded, classify its severity level from 1 to 5 as described in detail under Appendix C- Severity Classifications. Conduct Root Cause Analysis for the defects reported as per PM357- Causal Analysis and Resolution. (An illustrative load-measurement sketch follows this table.)
Responsibility: TL / Developer / Architect
Artifacts/References: Test Scenarios; TP012- Test Case Design; Test Cases, Test Scripts

Step 4: Validate the fine-tuned application by conducting Regression and Re-Testing. Close all the defects reported under the Defect Tracking Tool and validate the same.
Responsibility: TM / Test Engineer
Artifacts/References: Test Scenarios, Test Scripts; Test Data

Step 5: Report test progress in the form of Test Status Reports (TP028).
Responsibility: PM / Test Manager / Test Lead
Artifacts/References: TP028- Test Status Reports

Step 6: Report, capture and analyze Test Metrics.
Responsibility: PM / Test Manager / Test Lead
Artifacts/References: TP011-AppTstM- Project Metrics for Testing

Step 7: Prepare the Final Test Summary Report on completion of the System Testing. This report acts as a reference point and provides a summary of the testing conducted along with details of test strategies, test tools, etc. It also provides mapping of test status and test incidents with the requirements tested.
Responsibility: PM / Test Manager / Test Lead
Artifacts/References: TP029- Final Test Summary Report
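As a generic illustration of the kind of load measurement referred to in Step 3 (the procedure itself does not mandate any particular tool), the sketch below issues concurrent HTTP requests against a hypothetical endpoint and reports response-time statistics. The URL, concurrency level and request count are assumptions; a dedicated load-testing tool would normally be used instead.

```python
# Minimal concurrent load-measurement sketch (illustrative only; not a
# replacement for a load-testing tool). URL and parameters are hypothetical.

import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET_URL = "http://example.com/"   # hypothetical system under test
CONCURRENT_USERS = 10
REQUESTS_PER_USER = 5

def timed_request(_: int) -> float:
    """Issue one GET request and return its response time in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(TARGET_URL, timeout=30) as response:
        response.read()
    return time.perf_counter() - start

with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
    durations = list(pool.map(timed_request, range(CONCURRENT_USERS * REQUESTS_PER_USER)))

print(f"requests: {len(durations)}")
print(f"mean response time: {statistics.mean(durations):.3f} s")
print(f"95th percentile:    {statistics.quantiles(durations, n=20)[-1]:.3f} s")
```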

6.2.4 Outputs

Non Functional test results


TP028- Test Status Reports
TP029- Final Test Summary Report
Metric Reports

6.2.5 Exit Criteria

Test Acceptance Criteria has been met.


Final Test Summary Report is available


Chapter 7 Test Maintenance Phase

7 Description

The objective of this phase is to ensure a defect-free delivery of the system once the approved changes have been incorporated into the system.
7.1 Entry Criteria
Changes incorporated in the system under test based on the Customer's change request
Defect fixes that have triggered a change in code
7.2 Inputs
Change Requests
Modified SRS and other technical specifications
7.3 Tasks
Table 8 Tasks

Step 1: Modify all Test Scenarios/ Test Cases/ Test Scripts and the Traceability Matrix based on the changes made to the SRS and technical specifications as a result of the Customer's change request.
Responsibility: TM / Test Engineer
Artifacts/References: SRS and technical specifications; Test Scenarios, Test Cases, Test Scripts

Step 2: Review all the Test Scenarios/ Test Cases/ Test Scripts considering the changes made to the specification documents on the Customer's change request.
Responsibility: PM / Test Manager / Test Lead
Artifacts/References: SRS and technical specifications; Test Scenarios, Test Cases, Test Scripts

Step 3: Baseline the test scenarios / cases / scripts and bring them under Configuration Management as per PM321.
Responsibility: PM / Test Manager / Test Lead / Customer / CC
Artifacts/References: SRS and technical specifications; Test Scenarios, Test Cases, Test Scripts

Step 4: Conduct System Testing for all the test scripts and report any further modification considering the Customer's change request.
Responsibility: Test Engineer

Step 5: Move to the Test Execution sub-process (Section 2.5) and conduct re-testing once the code has been modified based on the Customer's change request.
Responsibility: Test Engineer


7.4 Outputs
TP012- Test Case Design document with updated execution logs
TP028- Test Status Reports
TP029- Final Test Summary Report
Metrics Report
7.5 Exit Criteria
Test Acceptance Criteria has been met.
Test Summary Reports are available


Chapter 8 Metrics

8 Metrics

For all the items tested, compute the following (a minimal computation sketch follows this list):

The number of defects for each severity level from 1 to 5
The total actual effort spent to fix the defects
The sum of phase differences between the phase in which each defect was detected and the phase in which it was introduced
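As an illustration of these computations, the sketch below derives the three measures from a list of defect records. The phase ordering and the sample defects are assumptions made for the example; the actual metrics are reported through TP011-AppTstM- Project Metrics for Testing.

```python
# Illustrative computation of the three metrics above from defect records.
# Phase ordering and sample data are assumptions for this sketch.

from collections import Counter

# Ordinal position of each lifecycle phase, used to compute phase differences.
PHASE_ORDER = {"Requirements": 1, "Design": 2, "Construction": 3, "System Test": 4, "Acceptance": 5}

# Hypothetical defect records: (severity 1-5, fix effort in hours,
# phase introduced, phase detected).
defects = [
    (2, 6.0, "Design", "System Test"),
    (3, 2.5, "Construction", "System Test"),
    (1, 12.0, "Requirements", "Acceptance"),
]

# 1. Number of defects for each severity level.
defects_by_severity = Counter(severity for severity, _, _, _ in defects)

# 2. Total actual effort spent to fix the defects.
total_fix_effort = sum(effort for _, effort, _, _ in defects)

# 3. Sum of phase differences (detected phase minus introduced phase).
phase_difference_sum = sum(
    PHASE_ORDER[detected] - PHASE_ORDER[introduced]
    for _, _, introduced, detected in defects
)

print("Defects by severity:", dict(sorted(defects_by_severity.items())))
print("Total fix effort (hours):", total_fix_effort)
print("Sum of phase differences:", phase_difference_sum)
```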


Chapter 9 Tailoring /Waivers

9 Tailoring /Waivers Guidelines

This section describes the tailoring process to be carried out for test execution.
The System Testing and Acceptance Testing phases can be combined under the following situations:
- As per the Customer's requirements
- If the Dell Services AS team is working on-site and/or on the Customer's machines, with the Customer's concurrence
In case a test tool is used, reports generated by the tool may be used instead of the test log and test incident report.


Appendix A Responsibilities
Roles and Responsibilities

PM / Test Manager
- Identify the Test Strategy and Acceptance Criteria
- Prepare Test Plan
- Briefing the team on test documentation
- Prepare Test Summary Reports
- Liaison with Tool vendors & Infrastructure

PL / Test Lead
- Conduct Test Readiness Review
- Support PM in preparation of Test Strategy, Test Plan and Defect Root Cause Analysis
- Support test team and facilitate testing
- Maintain and update for any changes
- Create and/or maintain test environment (database setup, tool setup, etc.)
- Management of defects using a defect management tool
- Regular status reports to Test / Project Manager

PM / Test Manager / PL
- Plan for knowledge transfer and transition activities
- Prepare Test Plan and Test Strategy
- Conduct Traceability Analysis
- Review Test Log
- Conduct Defect Root Cause Analysis
- Collate Test Summary Reports for deriving Metrics
- Prepare Final Test Summary Report

Test Engineer / TM
- Study test documentation including Test Strategy, Test Plan, Test Scenarios, etc.
- Prepare Test Scenarios and Cases and get them reviewed by the reviewer
- Conduct Testing
- Document the test results in the Test Log
- Log, update and close defects as per retest using a defect tracking tool

CC
- Configure all test related documents including Test Data and Test Results


Appendix B Related Documents


Document Number Document Name
CL006 Test Case Review Checklist
HB008 Application Testing - Handbook
P3MM03-015_PPD_AS Project Planning Document_AS
P3MM03-220 Review Comment Log - AS    Review Comment Log
P3MM10-020_RAIDOMatrix_AS    RAIDOMatrix_AS
PM001 Estimation
PM312 Review Procedure
PM321 Configuration Management
PM354 Software Metrics
PM357 Causal Analysis and Resolution
QR012 Resource Request Form
QR043 Review Summary Sheet
QR051/CMTool Configuration Register
QR513 System Resource Requirement Form
SG090 Guidelines for Testing
SG160 Estimation Guidelines for Testing
TP007/Tool Traceability Matrix
TP008 Software Requirement Specification
TP010-AppTstM Test Plan
TP011-AppTstM Testing Project Metrics
TP012 Test Case Design
TP028 Test Status Reports
TP029 Final Test Summary Report
TP032 Function Point Based Estimation Model for Testing
TP032A Requirement Based Estimation Model for Lifecycle Testing
TP512 Technical Problem/ Issue/ Query Log


Appendix C Severity Classifications


Note: The Severity Classifications below can vary based on the requirements of the project.

Severity 1 - System Critical
Description of Business Impact: A total unplanned system outage which affects multiple users performing critical functionality, where there is no workaround.
Examples of Business Impact: The entire system is inoperable; significant and unrecoverable data loss; system crash.

Severity 2 - Major Functional
Description of Business Impact: Impairment of critical system functions. No workaround exists, or the workaround is cumbersome and causes an impact on productivity. Limited use in Production; some requirements not met.
Examples of Business Impact: Critical or frequently used parts of the system are impaired or inoperable; data corruption which has a critical business impact; some types of processing cannot be done, although processing could continue in other areas.

Severity 3 - Minor Functional
Description of Business Impact: Impairment of less critical system functions. Useable in Production with some procedural workarounds. The function(s) will not perform as expected and the business impact is moderate.
Examples of Business Impact: The defect would reduce effectiveness, but a workaround is sustainable and there is no threat to the ability to serve customers; data corruption which may compound until fixed, but which can be recovered. A workaround is defined as something readily apparent to a user with basic knowledge of the product, or easily explained in a manual or over the phone.

Severity 4 - Cosmetic
Description of Business Impact: Inconvenience, annoyance or cosmetic issue.
Examples of Business Impact: Problems which have no effect on the functioning of the system, e.g. screen/field layout, colors, etc., and do not materially impair the user.

Severity 5 - Improvements/ Enhancement Suggestions
Description of Business Impact: Problems which have no effect on the functioning of the system, e.g. screen/field layout, colors, etc., and do not materially impair the user.


Glossary

Test Case - A document specifying inputs, predicted results, and a set of execution conditions for a test item.

Test Item - A software item that is an object of testing.

Test Plan - A document describing the scope, approach, resources, and schedule of intended testing activities.

Test Defects Handling - Test defects go through the following steps (a minimal state-transition sketch follows this glossary):
- Defect Entry - The tester enters a brief description of the defect, the item tested, the item version/ build, the test phase and the ID of the test case which failed. Error log file(s)/ screen(s) that can help to reproduce and analyze the defect should be attached.
- Defect Analysis - Analyze the defect for its impact and validity (is it a real defect?). Valid defects are assigned to a developer. The defect is classified on a severity level of 1 to 5.
- Defect Fixing - After fixing the defect, the developer enters a summary of the changes made in the code to correct the defect.
- Verification - The tester verifies the defect fix by retesting against the appropriate test cases.
- Closure - After verification of the defect fix, the defect is closed.
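To make the defect-handling flow above concrete, the sketch below models the allowed state transitions as a small state machine. The state names mirror the glossary steps; the transition map itself (including the rejection of invalid defects) is an illustrative assumption, not a specification of the Defect Tracking System.

```python
# Illustrative defect-lifecycle state machine mirroring the glossary steps.
# The transition map is an assumption for this sketch, not the DTS's rules.

ALLOWED_TRANSITIONS = {
    "Entered":  {"Analyzed", "Rejected"},   # defect entry -> analysis (or not a real defect)
    "Analyzed": {"Fixed"},                  # valid defect assigned to and fixed by a developer
    "Fixed":    {"Verified", "Analyzed"},   # retest passes, or fails and returns for analysis
    "Verified": {"Closed"},                 # verified fixes are closed
    "Rejected": set(),
    "Closed":   set(),
}

def advance(current: str, new: str) -> str:
    """Move a defect to a new state if the transition is allowed."""
    if new not in ALLOWED_TRANSITIONS[current]:
        raise ValueError(f"Illegal transition: {current} -> {new}")
    return new

# Walk one defect through the lifecycle.
state = "Entered"
for nxt in ("Analyzed", "Fixed", "Verified", "Closed"):
    state = advance(state, nxt)
    print("Defect state:", state)
```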

