
Sample Test Plan

Prepared for

[Customer Name]

Project

[Project Name]

Prepared by
[Author]

Contributors
[Document contributors]

1 Table of Contents

2 Test Plan Overview
3 Objectives
4 Testing Approach
   4.1 Overview
      4.1.1 Dependencies and Entrance Criteria
      4.1.2 Acceptance Criteria
   4.2 Assumptions
5 Testing Deliverables and Responsibilities
   5.1 Testing Documentation (Scripts)
   5.2 Test Data
   5.3 Test Execution
6 Testing Resources
   6.1 Resource Requirements
      6.1.1 Environmental
      6.1.2 Staffing and Training
      6.1.3 Testing Tools
7 Testing Scope
8 Testing Procedure and Walkthrough
   8.1 Procedure and Activities by Testing Type
      8.1.1 Feature Testing
      8.1.2 Unit Testing
      8.1.3 Function Testing
      8.1.4 Sub-Process and Process Testing
      8.1.5 DAT Testing
      8.1.6 Integration Testing
   8.2 Test Script Execution (Customer Executed Tests)
   8.3 Test Result Tracking
9 Schedules

2 Test Plan Overview


Successful and comprehensive testing of system modifications and enhancements is crucial to both customer acceptance and building customer confidence. The purpose of this document is to establish the high-level plan for testing. This document also establishes the general standards and procedures to be followed when conducting software testing and validation for [Customer Name] business processes implemented within Pastel Evolution. The project team is responsible for testing and validating new functions prior to deployment to a production environment and delivery to the end users. The primary scope of this testing is to establish core business functions for [Customer Name] within Pastel Evolution.

This document is not intended to address every element of testing at a detailed procedural level. Rather, it is intended to outline a reasonable set of procedures representative of [Project Name], which will deploy the Pastel Evolution solution in a production setting. System modifications will be tested several times during the course of development and deployment. Each successive testing step needs specific test data, criteria and benchmarks, as well as clearly defined passing conditions. The tests will be exercised against carefully selected data that represents a cross-section of the business information processed by [Customer Name].

Each test scenario (test script) documents the objectives of the test, expected results, actual results and associated test data. Any issues or problems that arise during testing will be logged and tracked to ensure satisfactory and complete resolution. Details of each issue will be documented in a table within the relevant test script.

This document is intended for all project team members. It is important for [Customer Name] to review this test plan and to confirm that the proposed test strategy reasonably represents their testing requirements.

3 Objectives
The overall objective of the testing activities of the [Project Name] project is to validate that the Pastel Evolution solution
configuration, custom developed features, and developed integrations function as specified by [Customer Name]. Specific
business scenarios must be selected and the data that relates to these must be chosen carefully. Execution of test
scripts, and the steps that comprise them, must be planned so that the selected business scenarios can be
demonstrated.
The scope of testing determines the activities that must occur. Each activity and step has formal entrance and exit
criteria.

4 Testing Approach
4.1 Overview
Testing activities extend from the implementation and deployment of the Pastel Evolution solution through to the Go-Live phase. Activities for creating, updating and executing test scripts occur during the Design & Build phase. The testing activities in the Design & Build phase include:

- Conducting feature testing of the standard solution
- Evaluating the results
- Making the required adjustments to configuration
- Identifying the process-testing scenarios
- Creating test scripts for custom code (for unit testing and function testing)

The testing activities in the Design & Build phase include the following tests:

- Feature Testing - Stand-alone testing of the system configuration, performed during the Configuration process by the application consultants.

- Sub-Process Testing - Testing of related features that make up a defined business process, performed during Configuration by the Customer and the Application Consultants.

- Unit Testing - Stand-alone testing of the system modification (custom code), performed during Design & Build by the developers.

- Function Testing - Stand-alone testing of the system modification (custom code), performed during Design & Build by the application consultants.

- Process Testing - Complete testing of related features and functions that make up a defined business process, performed during Design & Build by [Customer Name].

- Data Acceptance Testing (DAT) - Testing performed by Data Owners and Key Users in the Design & Build phase prior to or during integration testing. During DAT, [Customer Name] not only verifies the data migrated but also validates that the data may be inquired upon, reported upon, and transacted upon.

- Integration Testing - Integrated testing of business processes performed by the Key Users prior to system sign-off. This form of testing focuses on end-to-end business processes, including development, interfaces, reports, and integrations to external systems. This testing during the Design & Build phase involves the following processes:
   - Validating that the configured system under load will meet the performance metrics
   - Confirming that the overall setup and configuration of the system meets the customer's business requirements
   - Pre-approval for User Acceptance sign-off

- Performance Testing - Performance Testing will test business processes and integration. This process will focus on the high transaction volumes anticipated during peak times and will help validate that system performance meets the business requirements.

- User Acceptance Testing (UAT) - UAT will be the final testing performed by the Key Users prior to system sign-off. The End Users selected to perform the UAT must receive appropriate training prior to the start of the UAT.

4.1.1 Dependencies and Entrance Criteria

- Formal delivery, review and formal acceptance of the test plan.
- Delivery and review of finalized [Customer Name]-prepared test scripts prior to the associated testing activity (i.e. process test scripts must be finalized before process testing can be executed).

Requisite pre-conditions for each testing activity:

Conduct Feature Testing
- The Test Plan has been reviewed
- Functional Design Document reviewed
- Completed configuration of the proposed feature in the Pastel Evolution solution
- The feature has been configured in the appropriate Test environment

Develop Process Test Scripts
- The Test Plan has been reviewed
- Contributors have been provided an overview/understanding of Process Testing
- The Functional Design Document/Solution Design Document has been reviewed

Develop Data Acceptance Test Scripts
- The Test Plan has been reviewed
- Solution Design Document and Technical Design Document reviewed
- Data migration and mapping rules have been designed and confirmed

Develop Integration Test Scripts
- The Test Plan has been reviewed
- Solution Design Document and Technical Design Document reviewed
- The integration/interface has been configured in the appropriate Test environment

Conduct Unit & Function Testing
- The Test Plan has been reviewed
- Solution Design Document and Technical Design Document reviewed
- The custom feature has been configured in the appropriate Test environment

Develop Performance Scripts
- The Test Plan has been reviewed
- Functional Requirements Document reviewed

Develop User Acceptance Scripts
- The Test Plan has been reviewed
- The Functional Design Document/Solution Design Document has been reviewed

Conduct Sub-Process & Process Testing
- The Test Plan has been reviewed
- Completed configuration of the proposed feature
- Completed custom code development of the proposed function
- Completed integration and interface development for the proposed function
- Feature, Unit and Function Testing have been successfully completed
- Process Test scripts have been written
- Functional Requirements document reviewed
- The feature has been configured in the appropriate Test environment

Conduct Data Acceptance Testing
- The Test Plan has been reviewed
- Functional Requirements and Functional Design Documents reviewed
- Data Migration Requirements reviewed
- Data Migration Test Scripts have been developed
- Source System/Legacy data has been gathered and cleansed prior to migration
- Data Migration Mapping has been completed and transformations executed upon the source data

Conduct Integration Testing
- The Test Plan has been reviewed
- Completed configuration of the proposed feature
- Completed custom code development of the proposed function
- Integration Testing Scripts are defined and finalized
- Security has been configured and activated in the system

Conduct Performance Testing
- The Test Plan has been reviewed
- Performance test scripts have been developed
- The overall functionality of the system has been demonstrated and the users have completed Integration Testing successfully

Conduct User Acceptance Testing
- The Test Plan has been reviewed
- UAT test scripts have been developed
- The Quality and Testing Requirements (Test Plan) have been reviewed
- The corresponding Testing environment has been set up and configured
- Data has been migrated to the corresponding Testing environment

Table 1: Test Activity Dependencies & Pre-conditions

4.1.2 Acceptance Criteria

- Formal review and acceptance of test script results.
- All Test Strings and Test Steps have been executed successfully, or reasons for non-execution have been documented and approved.
- All severity 1 failed tests (severity is defined in section 8.3) reported in the Test Scenario Summary Reports have been resolved, tested successfully and closed.
- All severity 2 failed tests reported in the Test Scenario Summary Reports are closed or have an action plan in place for closure prior to the closure of Integration Testing.
- All severity 3 and 4 failed tests reported in the Test Scenario Summary Reports are closed or have an action plan in place for closure prior to the Go-Live date or within a period of time approved by [Customer Name].

4.2 Assumptions

- [Partner Organization] will conduct Unit and Feature testing. [Partner Organization] and [Customer Name] will conduct Sub-Process and Function testing.

- [Customer Name] shall create, document and finalize the UAT scripts, under guidance from [Partner Organization], during the design phase of the project. The UAT Test Scripts shall conform to the scope within the System Blueprint. This conformance will be tracked with traceability matrices between the Blueprint and the UAT Test Scripts. Any changes to the UAT scripts shall follow the Change Management process.

- It is essential that [Customer Name] retains ownership of all deliverable testing activities. Where required, [Partner Organization] will provide a resource to assist in executing the tests.

- [Customer Name] shall deliver the UAT Test Scripts/Cases before the end of the Design & Build phase. Any delays in delivering the UAT Test Cases/Scripts may adversely impact the timeline, cost and quality of the delivered solution.

- [Customer Name] shall be responsible for organizing and executing the user acceptance tests, and for logging defects in the defect tracking logs identified by [Partner Organization] and [Customer Name].

- [Customer Name] will be responsible for End-User Acceptance.

5 Testing Deliverables and Responsibilities


Responsibilities are often described using a Responsibility Assignment Matrix (RAM). A specific form of a RAM is the RACI matrix, as shown below. The matrix shows which role is responsible, accountable, consultative and informed.

RACI Matrix:
- R - Responsible, implying the main responsibility in doing/delivering the effort and skill.
- A - Accountable, implying management (overrides R when both are implied).
- C - Consultative, implying assistance (both active and advisory).
- I - Informed, implying a requirement of the one responsible to report on the task.

5.1 Testing Documentation (Scripts)


A test script is a document which outlines the step-by-step actions required to complete a specific test (feature, sub-process, process, or integrated string of processes).
Each script must meet the following requirements (an illustrative sketch follows this list):

- Uniquely numbered and named
- Identifies the functional/business domain
- Describes the testing purpose
- Specifies the environment to be used
- Provides the data requirements
- Outlines any dependencies
- Lists each required execution step
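Purely as an illustration of the fields listed above, a single script record might be captured in a structure like the following; the field names and sample values are assumptions for this sketch, not a project standard.

```javascript
// Illustrative sketch only: one test script record carrying the fields listed above.
// Field names and sample values are assumptions, not part of the project template.
var sampleTestScript = {
    id: "TS-FIN-001",                                  // unique number and name
    name: "Capture and post a supplier invoice",
    businessDomain: "Finance / Accounts Payable",
    purpose: "Validate invoice capture and posting in the Pastel Evolution solution",
    environment: "TEST",
    dataRequirements: ["Active supplier account", "Open accounting period"],
    dependencies: ["Master data migrated (TS-DAT-001)"],
    steps: [
        { step: 1, action: "Open the Supplier Invoice screen", expected: "Screen loads without error" },
        { step: 2, action: "Capture and post the invoice",     expected: "Invoice posted and ledger updated" }
    ]
};
```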

A Microsoft Office Excel spreadsheet can be used to document test scripts for future execution. Optionally, an accompanying Microsoft Office Word document can be constructed which includes screenshots to help testers execute the script steps and an area for testers to paste screenshots of test results. The table below lists the test script development activities and the associated responsibilities for [Partner Organization] and [Customer Name].

Test Script Development Activity              | [Partner Organization] | [Customer Name]
Develop Sub-Process and Process Test Scripts  | C, I                   | R, A, C, I
Develop Integration Test Scripts              | C, I                   | R, A, C, I
Develop Data Acceptance Test Scripts          | C, I                   | R, A, C, I
Develop Performance Scripts                   | C, I                   | R, A, C, I
Develop User Acceptance Scripts               | C, I                   | R, A, C, I
Table 2: Test Script Development Activity RACI Matrix
[Customer Name] will reference and leverage the Functional Design Document, Future State Business Process Flows,
and Solution Design Document when developing the requisite test scripts.

5.2 Test Data


Test data for Accounts, Contacts, Properties, and Named Roads will be provided via the integration with <<Legacy
System>> as per the data migration strategy. [Customer Name] will be responsible for having all required data cleansed,
transformed, and entered into <<Legacy System>> prior to process testing. [Customer Name] will be responsible for
providing test data for test script execution prior to commencement of process, integration, data acceptance,
performance, and user-acceptance testing.

5.3 Test Execution


With the exception of unit and feature tests conducted by [Partner Organization], all other forms of testing will be primarily
executed by [Customer Name] key users.
Testing Execution Activity        | [Partner Organization] | [Customer Name]
Conduct Unit and Feature Testing  | R, A, C, I             | C, I
Conduct Process Testing           | C, I                   | R, A, C, I
Conduct Data Acceptance Testing   | C, I                   | R, A, C, I
Conduct Integration Testing       | C, I                   | R, A, C, I
Conduct Performance Testing       | R, A, C, I             | R, C, I
Conduct User Acceptance Testing   | C, I                   | R, A, C, I
Table 3: Test Execution Activity RACI Matrix

6 Testing Resources
6.1 Resource Requirements
6.1.1 Environmental

6.1.1.1 Hardware and Software

The TEST environment will be used for the execution of process, integration, data acceptance, performance and user
acceptance testing. The TEST environment will be virtualized and require a test instance of the Pastel Evolution solution
and integrated legacy system(s).
As per section 6.1.3.1, the Performance Testing Toolkit for the Pastel Evolution solution will need to be installed in the
TEST environment prior to conducting performance tests. In order to install the performance toolkit, [Partner
Organization] will be responsible for setting up the prerequisite software (e.g. Microsoft Visual Studio 2005 Team
System) to support this tool.

6.1.1.2 Data

Test data for <<specify record data>> will be provided via the integration with <<legacy system(s)>> as per the data migration strategy. [Customer Name] will be responsible for having all required data cleansed, transformed, and entered into <<legacy system(s)>> prior to process testing. [Customer Name] will be responsible for providing test data for test script execution prior to commencement of process, integration, data acceptance, performance, and user acceptance testing (see also section 5).

6.1.1.3 Documentation

Test scripts (see section 5.1) will be the key form of documentation used to aid the execution of testing of the developed Pastel Evolution solution. Standard training materials can be referenced to assist users during the execution of Process and Integration tests, while [Customer Name] custom-developed training documentation can be leveraged during User Acceptance Testing.
All test results will be documented as per the process indicated in section 8.3.

6.1.1.4 Resource Sharing Requirements

The provision of sufficient client environments for accessing the TEST environment will be the responsibility of [Customer Name]. [Customer Name] is required to set up a minimum of [##] online Outlook client environments and one offline (laptop) Outlook client environment to adequately execute testing. The remaining users can leverage the web client for conducting the applicable testing cycle.
[Customer Name] will need to provide adequate room facilities to accommodate the various testing activities.

6.1.2 Staffing and Training

The following table outlines the roles and expected resource counts from [Customer Name] required to execute the listed testing activities.

Testing Activity                      | Role               | Count | Required Training
Develop Process Test Scripts          | Key User           | 6     | Core Team Training
Develop Data Acceptance Test Scripts  | Key User           | 6     | Core Team Training
Develop Integration Test Scripts      | Key User           | 6     | Core Team Training
Develop Performance Scripts           | Key User           | 6     | Core Team Training
Develop User Acceptance Scripts       | Key User           | 6     | Core Team Training
Conduct Process Testing               | Key User           | 6     | Core Team Training
Conduct Integration Testing           | Key User           | 6     | Train-the-Trainer Training
Conduct Performance Testing           | Key User           | 6     | Train-the-Trainer Training
Conduct Data Acceptance Testing       | Key User, End User | 10    | Train-the-Trainer Training
Conduct User Acceptance Testing       | Key User, End User | 10    | Train-the-Trainer Training
Table 4: Staffing and Training Requirements

6.1.3 Testing Tools

6.1.3.1 Performance Testing

Performance/load testing is best done using appropriate platform-specific tools that place significant stress upon each function of the system. Using tools to automate performance testing helps ensure repeatability, reduces errors, and removes the otherwise heavy resource requirements of executing this form of testing. With the Pastel Evolution solution, the <<input performance testing tool>> can be used to formalize the performance testing of the Pastel Evolution solution.
Using the provided tools, the workload is identified and the workload scripts are developed. These scripts are run by the load testing framework of Visual Studio and present the System Under Test (SUT) with a stream of requests that mimic what the Pastel Evolution solution would encounter during typical business operation. Additionally, both the number of simulated users and the rate at which those users perform their actions can be controlled to determine sizing and scalability limits for a given system and workload.
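The sketch below is conceptual only; it is not the Visual Studio load-testing framework or the <<input performance testing tool>> itself, and the endpoint, user count and pacing values are assumptions. It simply illustrates the two control knobs described above: the number of simulated users and the rate at which each user acts.

```javascript
// Conceptual sketch of a workload driver: N simulated users, each issuing a
// request at a fixed pace. Values and the endpoint are illustrative only.
var simulatedUsers = 25;        // how many concurrent users to mimic
var thinkTimeMs = 2000;         // pause between each user's actions (controls the request rate)

function issueRequest(url, userId) {
    // Placeholder: a real workload script would replay a recorded business
    // transaction against the System Under Test and record the response time.
}

function simulateUser(userId) {
    setInterval(function () {
        issueRequest("/evolution/transactions", userId);   // hypothetical endpoint
    }, thinkTimeMs);
}

for (var i = 0; i < simulatedUsers; i++) {
    simulateUser(i);
}
```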

Required Tools                | Location         | Existing or Requires Install/Dev
<<Performance Testing Tool>>  | TEST Environment | Required installation in the TEST environment
Table 5: Required Testing Tools

7 Testing Scope
The following items are in scope for the testing cross-phase activity, as it pertains to the approach, assumptions and responsibilities outlined earlier in this document:

- Testing of the Pastel Evolution solution transaction-level processing in Test Steps.
- Testing of end-to-end Business Scenarios, identified as Test Strings composed of a culmination of test scripts.
- Testing utilizing a representative sample of <<input record types>> data.
- Testing of the Pastel Evolution solution standard and ad-hoc reporting.
- Testing of the Pastel Evolution solution configured views and interface changes.
- Testing of custom code developed by the project delivery team.
- Testing of custom-developed integrations with legacy systems by the project delivery team.
- Testing of the Pastel Evolution solution system performance by the project team, as it pertains to business processes.
- Documentation of test results in test script documents.

8 Testing Procedure and Walkthrough


8.1 Procedure and Activities by Testing Type
8.1.1 Feature Testing

Stand-alone testing of the configured feature should take place in the TEST environment. The goal of this testing is to validate that the Pastel Evolution solution configuration and sample customer data meet [Customer Name]'s business process requirements. During this testing, the application consultant(s) should test all data validation aspects of the feature, as well as any functionality contained wholly within the feature. It is also preferred that Key Users participate in executing this testing.
The end result of this testing is to ensure that the configured features have been fully tested, with a degree of confidence that any subsequent issues are the result of the feature's interaction with other components of the environment, and not of the feature itself. It should be noted that in some cases stand-alone feature testing is not feasible due to the design of the feature. In such cases, the feature should be introduced into the Integration Testing process and feature testing should be performed at that time.
8.1.2 Unit Testing

The Unit Testing process starts with the [Partner Organization] developer conducting Unit Tests of the custom code to detect and resolve any issues. Since the testing is being conducted by the developer, any anomalies or issues discovered during this testing will be resolved without referring the feature back to a previous step, where possible.
In the Pastel Evolution solution, there are a number of procedures that make developing and testing scripts easier. Some of those procedures are listed below (a minimal sketch follows the list):

- Preview - The preview feature in the Entity Form Customization page can be used to test the code for the OnLoad, OnSave, and OnChange events while developing it. Code should include conditional statements to test for all the possible FormType properties. The code should also always be tested in the application after publishing the entity customizations, to confirm that it behaves as it did in the preview (see Function Testing details below).

- Alerts - The Jscript alert() method can be used to test values while developing code, but these alerts must be commented out or removed before the code is finished.

- Script Editor - The Microsoft CRM Event Detail Properties window should not be used to write the code. It is better to use an external script editor (such as Microsoft Visual Studio, Microsoft FrontPage, or Notepad) and paste the scripts into the window.

- Reference External Scripts - The OnLoad event makes it possible to inject an HTML <script> element into the head of an HTML document. This allows the developer to define functions in a separate Jscript file and load that script with the page when the form loads. Because these functions are included when the form is loaded, they can be called from any form or field event for that entity. By taking this approach, the amount of Jscript placed in the Event Detail Properties window can be reduced. The entity does not need to be republished for changes to the functions to take effect; the Jscript file simply needs to be saved.
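As a minimal sketch of the external-script approach above, the stub below could be pasted into an OnLoad event, with the shared functions kept in a separate file. The file path and the crmForm.FormType check are assumptions for illustration, not part of this project's design.

```javascript
// OnLoad stub (illustrative only): load shared helper functions from an
// external Jscript file so the Event Detail Properties window stays small.
var sharedScript = document.createElement("script");
sharedScript.src = "/ISV/[ProjectName]/shared_form_functions.js";   // hypothetical path
document.getElementsByTagName("head")[0].appendChild(sharedScript);

// Branch on the form type so Create, Update and other modes are all handled.
// crmForm.FormType values (1 = Create, 2 = Update) are assumed here.
if (typeof crmForm != "undefined") {
    switch (crmForm.FormType) {
        case 1:   // Create
            // alert("OnLoad fired for a new record");   // debug alert: remove before finishing
            break;
        case 2:   // Update
            break;
        default:
            break;
    }
}
```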

8.1.3 Function Testing

Following Unit Testing, the developer, and potentially the functional consultant(s), conduct Function Testing of the code in a TEST environment. Prior to beginning this testing, it is recommended that the testers review the functional requirements document relevant to the function being tested to ensure they understand the required functionality.

8.1.4 Sub-Process and Process Testing

Since testing all possible variants of the key design requirements would consume a significant amount of time, [Customer Name] will concentrate on a smaller, representative subset of the possible variants, based on the primary Pastel Evolution solution design requirements and the more common processes with the highest business volumes.
Sub-process testing is conducted to validate that, for the corresponding business process within a larger process framework, both the configuration of the Pastel Evolution solution and the custom code development meet [Customer Name]'s business process requirements. Sub-process testing may be required to test individual processes within a larger process prior to executing the holistic process test.
Process Testing is the complete testing of related features and functions that make up a defined business process, performed during Design & Build by [Customer Name]. Process Testing validates that, for the corresponding business process, both the configuration of the Pastel Evolution solution and the custom code development meet business process requirements. An example of a Process Testing scenario would be one that covers the "Create a Case" workflow, or one that validates the functionality associated with a specific Pastel Evolution solution entity. While it is imperative that Process Testing verify the functionality of all aspects of the solution being developed, it is not meant to be a system performance testing session. The Key Users and, if necessary, Subject Matter Experts will identify the Process Testing scenarios based on their To-Be process flows and make any changes after reviewing them. The scenarios should identify all functionality required to support the key business processes. The level of detail in the Process scenarios may vary as required.
The business process scenarios will require the Key User to validate data and expected results. All steps must be correct in order for the process script to pass. The test script will require the use of the corresponding data required for the testing. [Customer Name] will create, update and validate data to ensure adequate support of test script requirements. The criteria established in this activity of the Design phase are the basis for comparing the results of tests executed in the Design & Build phase.

8.1.5 DAT Testing

[Customer Name] Key Users will perform data analysis of the solution, in accordance with the data migration
requirements and data migration strategy. During DAT, the customer not only verifies the data migrated but also validates
that the data may be inquired upon, reported upon, and transacted upon.

8.1.6 Integration Testing

[Customer Name] Key Users will execute Interface and Integration Test Scripts for end-to-end business process testing. An example of this test would be a script that ties together the complete customer service contact management and request management processes.
This testing is conducted with the application security turned on, and is conducted in the TEST Environment. Each aspect
of feature security needs to be tested to ensure there are no issues, and to validate that the approved security design
was implemented correctly. In addition to the stringent testing requirements needed for custom features, standard
features or enhancements fully contained within the application need to be tested, as well, to ensure that user access
rights have been properly defined. These elements will be verified and validated in this activity.
The goal of this testing is to validate that all aspects of the Pastel Evolution solution, including all interacting/interfacing systems and subsystems (e.g. <<legacy system names>>), support [Customer Name]'s business processes and produce the expected results. Integration testing will also ensure that the introduction of additional interfaces or security won't have a negative effect on the previously validated system.

8.2 Test Script Execution (Customer Executed Tests)


Each test script will be carried out in various steps, including preparation, execution, and finalization. Preparation includes
determination of test script objectives, planned outcomes, and related dependencies which have been identified in the
test script.
Test scripts will be executed in the order specified on the test script/scenario listing spreadsheet. This order is critical so
that the team can successfully execute the tests for all of the planned business scenarios. Each test script establishes
data and conditions that will be used in subsequent test scripts/steps.
Each test script has one or more test steps that have been documented in the test script document. Testers will execute
each step and document data and results accordingly. Screen captures will provide graphic evidence of data values
before and after each test so that the team can document the Pastel Evolution solution configuration values that control
the behavior and results generated by the system.
Any errors that occur during testing will be logged and resolved according to the Test Result Tracking procedure
documented below in section 8.3 Test Result Tracking.
Review and approval of test scripts by [Customer Name] management is required in order to consider the test script
complete and accepted.

8.3 Test Result Tracking


Errors that are identified will be documented in the table contained within the test script document and also in the master Test Result Tracking Report (the summary page of the test script spreadsheet). The Test Result Tracking Report table is the master source for tracking all results that occur during testing.
Issues that arise during execution of test scripts must be logged and tracked until resolved. Each test result recorded as a failure should include a description of the problem that occurred and be categorized by impact and severity. These classifications form a scheme for prioritizing responses to the issue. Severity 1 and 2 errors must be resolved prior to Go-Live (see section 4.1.2 regarding acceptance criteria). Responses to severity 3 and 4 problems can be planned according to the availability of resources.
Issues such as those mentioned above are documented and logged in the Test Result Incident Log. The log will be managed via the Microsoft SharePoint project portal at the following location:
<<Insert location of Log>>
Resolutions are also recorded in this log. See section <<insert section reference>> of this document for an example of the Test Result Incident Log.


Information about the error incident is also maintained within the related test script documents. Errors are logged when they occur, and testing activities stop until the error has been categorized. Efforts to diagnose and resolve the issue will depend on the nature and severity of the problem (a compact encoding of this scheme is sketched after the list):

- Severity 1 - A critical business function is not functioning correctly or is not available. Manual processes or other alternatives are not possible. Continued testing of related downstream scripts is not possible, as downstream business functions will also be severely affected.

- Severity 2 - A critical business function is not functioning correctly, or is severely impaired. Manual processes or other alternatives are possible, but may not be practical. Continued testing of related downstream scripts may be possible without extending the error to downstream business functions, and the failed test script may be resolved and regression tested independently of the remaining test scripts.

- Severity 3 - A non-critical business function is not functioning correctly, or is severely impaired. Manual processes or other alternatives are possible. Continued testing of related downstream scripts may be possible without extending the error to downstream business functions. The failed test script may be resolved and regression tested independently of the remaining test scripts.

- Severity 4 - All other test failures that have minimal impact on the general customer population or that affect individual users. Solutions to these test script execution failures will be proposed and addressed if time and schedules permit, or logged for inclusion in future releases.
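Purely as an illustration (not part of the project's tooling), the severity scheme above could be encoded as a small lookup when summarising the Test Result Tracking Report; the flag names are assumptions, and the plan text above remains the authority.

```javascript
// Illustrative encoding of the severity scheme defined above.
// Flag names are assumptions; sections 4.1.2 and 8.3 hold the governing rules.
var severityScheme = {
    1: { blocksDownstreamTesting: true,  mustResolveBeforeGoLive: true  },
    2: { blocksDownstreamTesting: false, mustResolveBeforeGoLive: true  },
    3: { blocksDownstreamTesting: false, mustResolveBeforeGoLive: false },   // action plan acceptable
    4: { blocksDownstreamTesting: false, mustResolveBeforeGoLive: false }    // address as time permits
};

// Example check: can downstream test scripts continue after this failure?
function canContinueDownstreamTesting(severity) {
    return !severityScheme[severity].blocksDownstreamTesting;
}
```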


9 Schedules
The following table and chart outline the anticipated schedule for the testing activities as it pertains to the master project schedule, and provide insight into the durations for completing the activities. Please note that every activity runs concurrently with at least one other testing activity, with the exception of Conduct User Acceptance Testing. Please see section 5 of this document for details about the responsibilities for developing and executing the following deliverables.
Testing Activity                      | Start Date | Target Completion Date | Critical Due Date
Develop Process Test Scripts          |            |                        |
Develop Data Acceptance Test Scripts  |            |                        |
Develop Integration Test Scripts      |            |                        |
Conduct Unit and Feature Testing      |            |                        |
Develop Performance Scripts           |            |                        |
Develop User Acceptance Scripts       |            |                        |
Conduct Process Testing               |            |                        |
Conduct Data Acceptance Testing       |            |                        |
Conduct Integration Testing           |            |                        |
Conduct Performance Testing           |            |                        |
Conduct User Acceptance Testing       |            |                        |
Table 6: Testing Activities and Schedules
ID | Task Name                                        | Duration
1  | Develop Process Test Scripts                     | 6.6w
2  | Develop Data Acceptance Test Scripts             | 6.6w
3  | Develop Integration Test Scripts                 | 9.6w
4  | Conduct Unit and Feature Testing                 | 5w
5  | Develop Performance Test Scripts                 | 8.8w
6  | Develop User Acceptance Test Scripts             | 11w
7  | Conduct Process Testing                          | 1w
8  | Conduct Data Acceptance and Integration Testing  | 1.8w
9  | Conduct Performance Testing                      | 2w
10 | Conduct User Acceptance Testing                  | 3w

Figure 1: Testing Activities Schedule Durations (Gantt chart covering November 2008 to March 2009)
