Prepared for: [Customer Name]
Project: [Project Name]
Prepared by: [Author]
Contributors: [Document contributors]
1 Table of Contents
Objectives
Schedules
3 Objectives
The overall objective of the testing activities of the [Project Name] project is to validate that the Pastel Evolution solution configuration, custom-developed features, and developed integrations function as specified by [Customer Name]. Specific
business scenarios must be selected and the data that relates to these must be chosen carefully. Execution of test
scripts, and the steps that comprise them, must be planned so that the selected business scenarios can be
demonstrated.
The scope of testing determines the activities that must occur. Each activity and step has formal entrance and exit
criteria.
4 Testing Approach
4.1 Overview
Testing activities extend from the implementation and deployment of the Pastel Evolution solution through to the Go-Live
Phase. Activities for creating, updating, or executing test scripts are discussed in the Design & Build phase. The testing activities in the Design & Build phase include the following tests:
- Feature Testing: Stand-alone testing of the system configuration, performed during the Configuration process by the application consultants.
- Sub-Process Testing: Testing of related features that make up a defined business process, performed during Configuration by the Customer and the Application Consultants.
- Unit Testing: Stand-alone testing of the system modification (custom code), performed during Design & Build by the developers.
- Function Testing: Stand-alone testing of the system modification (custom code), performed during Design & Build by the application consultants.
- Process Testing: Complete testing of related features and functions that make up a defined business process, performed during Design & Build by [Customer Name].
- Data Acceptance Testing (DAT): Testing performed by Data Owners and Key Users in the Design & Build phase prior to or during Integration testing. During DAT, [Customer Name] not only verifies the migrated data but also validates that the data may be inquired upon, reported upon, and transacted upon.
- Integration Testing: Integrated testing of business processes performed by the Key Users prior to system sign-off. This form of testing focuses on end-to-end business processes, including development, interfaces, reports, and integrations to external systems.
- Performance Testing: Testing of business processes and integration under load, focusing on the high transaction volumes anticipated during peak times, to help validate that system performance meets the business requirements. Performance testing during the Design & Build phase involves the following processes:
  o Validating that the configured system under load will meet the performance metrics
  o Confirming that the overall setup and configuration of the system meets the customer's business requirements
  o Pre-approval for User Acceptance sign-off
- User Acceptance Testing (UAT): The final testing performed by the Key Users prior to system sign-off. The End Users selected to perform the UAT must receive appropriate training prior to the start of the UAT.
4.1.1 Pre-Conditions
Each testing activity requires the delivery and review of the finalized, [Customer Name]-prepared test scripts prior to that activity (e.g. process test scripts must be finalized before process testing can be executed). The table below pairs each testing activity with its requisite pre-conditions.

Testing Activity | Requisite Pre-Conditions
4.1.2 Acceptance Criteria
- All Test Strings and Test Steps have been executed successfully, or the reasons for non-execution have been documented and approved.
- All severity 1 failed tests (severity is defined in section 8.3) reported in the Test Scenario Summary Reports have been resolved, tested successfully, and closed.
- All severity 2 failed tests reported in the Test Scenario Summary Reports are closed or have an action plan in place for closure prior to the closure of Integration testing.
- All severity 3 and 4 failed tests reported in the Test Scenario Summary Reports are closed or have an action plan in place for closure prior to the Go-Live date, or within a period of time approved by [Customer Name].
4.2 Assumptions
- [Partner Organization] will conduct Unit and Feature testing. [Partner Organization] and [Customer Name] will conduct Sub-Process and Function testing.
- [Customer Name] shall create, document, and finalize the UAT scripts, under guidance from [Partner Organization], during the Design phase of the project. The UAT test scripts shall conform to the scope within the System Blueprint; conformance will be tracked with traceability matrices between the Blueprint and the UAT test scripts (an illustrative matrix layout follows this list). Any changes to the UAT scripts shall follow the Change Management process.
- It is essential that [Customer Name] retain ownership of all deliverable testing activities. Where required, [Partner Organization] will provide a resource to assist in executing the tests.
- [Customer Name] shall deliver the UAT test scripts/cases before the end of the Design & Build phase. Any delay in delivering them may adversely impact the timeline, cost, and quality of the delivered solution.
- [Customer Name] shall be responsible for organizing and executing the user acceptance tests, and for logging defects in the defect tracking logs identified by [Partner Organization] and [Customer Name].
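The traceability matrix referenced above can be as simple as a three-column worksheet. The layout and identifiers below are purely illustrative, not a prescribed format:

Blueprint Requirement ID | UAT Test Script ID(s) | Coverage Status
BP-REQ-001 | UAT-014, UAT-015 | Covered
BP-REQ-002 | [To be assigned] | Gap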
A Microsoft Office Excel spreadsheet can be used to document test scripts for future execution. Optionally, an accompanying Microsoft Office Word document can be constructed that includes screenshots to help testers execute the script steps and an area for testers to paste screenshots of test results. An illustrative worksheet layout follows.
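As a sketch only, such a test script worksheet might use columns along the following lines; the column names are suggestions rather than a prescribed format:

Step # | Action | Input Data | Expected Result | Actual Result | Pass/Fail | Tester/Date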
The table below lists the test script development activities and the associated responsibilities for [Partner Organization] and [Customer Name] (R = Responsible, A = Accountable, C = Consulted, I = Informed):

Test Script Development Activity | [Partner Organization] | [Customer Name]
[Activity] | C, I | R, A, C, I
[Activity] | C, I | R, A, C, I
[Activity] | C, I | R, A, C, I
[Activity] | C, I | R, A, C, I
[Activity] | R, A, C, I | C, I
[Activity] | C, I | R, A, C, I
[Activity] | C, I | R, A, C, I
[Activity] | C, I | R, A, C, I
[Activity] | R, A, C, I | R, C, I
6 Testing Resources
6.1 Resource Requirements
6.1.1
6.1.1.1 Environmental
The TEST environment will be used for the execution of process, integration, data acceptance, performance and user
acceptance testing. The TEST environment will be virtualized and require a test instance of the Pastel Evolution solution
and integrated legacy system(s).
As per section 6.1.3.1, the Performance Testing Toolkit for the Pastel Evolution solution will need to be installed in the
TEST environment prior to conducting performance tests. In order to install the performance toolkit, [Partner
Organization] will be responsible for setting up the prerequisite software (e.g. Microsoft Visual Studio 2005 Team
System) to support this tool.
6.1.1.2 Data
Test data for <<specify record data>> will be provided via the integration with <<legacy system(s)>> as per the data migration strategy. [Customer Name] will be responsible for having all required data cleansed, transformed, and entered into <<legacy system(s)>> prior to process testing. [Customer Name] will also be responsible for providing test data for test script execution prior to the commencement of process, integration, data acceptance, performance, and user acceptance testing (see also section 5).
6.1.1.3 Documentation
Test scripts (see section 5.1) will be the key form of documentation to aid the execution of testing the developed Pastel Evolution solution. Standard training materials can be referenced to assist users during the execution of Process and Integration tests, while [Customer Name] custom-developed training documentation can be leveraged during User Acceptance Testing.
All test results will be documented as per the process indicated in section 8.3.
6.1.1.4 Client Environments
The provision of sufficient client environments for accessing the TEST environment will be the responsibility of [Customer Name]. [Customer Name] is required to set up a minimum of [##] online Outlook client environments and one offline (laptop) Outlook client environment to adequately execute testing. The remaining users can leverage the web client for conducting the applicable testing cycle.
[Customer Name] will need to provide adequate room facilities to accommodate the various testing activities.
6.1.2
The following table outlines the roles and expected resource counts from [Customer Name] required to execute the listed testing activities.
Testing Activity | Role | Count | Required Training
[Testing activity] | Key User | 6 | Core Team Training
[Testing activity] | Key User | [##] | [Required training]
[Testing activity] | Key User | [##] | [Required training]
[Testing activity] | Key User | [##] | [Required training]
[Testing activity] | Key User | [##] | [Required training]
[Testing activity] | Key User | [##] | [Required training]
[Testing activity] | Key User | [##] | Train-the-Trainer Training
[Testing activity] | Key User | [##] | Train-the-Trainer Training
6.1.3 Testing Tools
6.1.3.1 Performance Testing
Performance/load testing is best done using appropriate platform-specific tools that place significant stress upon each function of the system. Using tools to automate performance testing helps ensure repeatability, reduces errors, and removes the otherwise heavy resource requirements of executing this form of testing. With the Pastel Evolution solution, the <<input performance testing tool>> can be used to formalize performance testing. Using the provided tools, the workload is identified and the workload scripts are developed. These scripts are run by the load testing framework of Visual Studio and present the System Under Test (SUT) with a stream of requests that mimic what the Pastel Evolution solution would encounter during typical business operation. Additionally, both the number of simulated users and the rate at which those users perform their actions can be controlled to determine sizing and scalability limits for a given system and workload.
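As a purely illustrative example (the figures are placeholders to be replaced with [Customer Name]'s actual peak volumes): a peak-period workload might simulate 50 concurrent users, each creating 10 service cases per hour, with the simulated user count stepped up in increments of 25 until response times exceed the agreed performance metrics. The highest load at which the metrics still hold establishes the sizing and scalability limit for that workload.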
Required Tools | Location
<<Performance Testing Tool>> | TEST Environment
7 Testing Scope
The following items are in scope for the testing cross-phase activity as it pertains to the approach, assumptions, and responsibilities outlined earlier in this document:
- Testing of end-to-end business scenarios, identified as test strings built up from a culmination of test scripts.
- Testing of the Pastel Evolution solution's configured views and interface changes.
- Testing of custom-developed integrations with legacy systems by the project delivery team.
- Testing of the Pastel Evolution solution's system performance, as it pertains to business processes, by the project team.
8.1.1 Feature Testing
Stand-alone testing of the configured feature should take place in the TEST environment. The goal of this testing is to validate that the Pastel Evolution solution configuration and sample customer data meet [Customer Name]'s business process requirements. During this testing, the application consultant(s) should test all data validation aspects of the feature, as well as any functionality contained wholly within the feature. It is also preferred that Key Users participate in executing this testing.
The end result of this testing is to ensure that the configured features have been fully tested, with a degree of confidence that any subsequent issues are the result of the feature's interaction with other components of the environment, and not of the feature itself. It should be noted that in some cases stand-alone feature testing is not feasible due to the design of the feature. In such cases, the feature should be introduced into the Integration testing process, and feature testing should be performed at that time.
8.1.2 Unit Testing
The Unit Testing process starts with the [Partner Organization] developer conducting unit tests of the custom code to detect and resolve any issues. Since the testing is conducted by the developer, any anomalies or issues discovered during this testing will be resolved without referring the feature to a previous step, where possible.
In the Pastel Evolution solution, there are a number of procedures that make developing and testing scripts easier. Some of those procedures are listed below (a combined sketch follows the list):
- Preview: The preview feature in the Entity Form Customization page can be used to test the code for the OnLoad, OnSave, and OnChange events while developing it. Code should include conditional statements to test for all the possible FormType properties. The code should also always be tested in the application after publishing the entity customizations, to confirm that it behaves as it did in the preview (see the function testing details below).
- Alerts: The JScript alert() method can be used to test values while developing code, but these alerts need to be commented out or removed before finishing.
- Script Editor: The Microsoft CRM Event Details Properties window should not be used to write the code. It is better to use an external script editor (such as Microsoft Visual Studio, Microsoft FrontPage, or Notepad) and paste the scripts into the window.
- Reference External Scripts: The OnLoad event makes it possible to inject an HTML <script> element into the head of the HTML document. This allows the developer to define functions in a separate JScript file and load that script with the page when the form loads. Because these functions are included when the form is loaded, they can be called from any form or field event for that entity. Taking this approach reduces the amount of JScript placed in the Event Detail Properties window. The entity does not need to be republished for changes to these functions to take effect; the JScript file simply needs to be saved.
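The following is a minimal sketch combining the Preview and Reference External Scripts techniques above, assuming the Microsoft CRM client-side form scripting model (the crmForm object and its FormType property, where 1 denotes the Create form and 2 the Update form). The script path and the commented-out alerts are hypothetical and for illustration only:

// OnLoad event sketch. Load a shared JScript file into the document head
// so that its functions are available to any form or field event.
var script = document.createElement("script");
script.src = "/ISV/scripts/entity_form.js"; // hypothetical path
document.getElementsByTagName("head")[0].appendChild(script);

// Branch on FormType so that every mode the form can open in is handled.
var CREATE_FORM = 1;
var UPDATE_FORM = 2;
switch (crmForm.FormType)
{
    case CREATE_FORM:
        // alert("Creating a new record"); // development-only check; remove before finishing
        break;
    case UPDATE_FORM:
        // alert("Updating an existing record"); // development-only check; remove before finishing
        break;
    default:
        break;
}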
8.1.3 Function Testing
Following Unit Testing, the developer, and potentially the functional consultant(s), conducts Function Testing of the code in a TEST environment. Prior to beginning this testing, it is recommended that the testers review the functional requirements document for the function being tested, to ensure they understand the required functionality.
8.1.4 Sub-Process and Process Testing
Since testing all possible variants of the key design requirements would consume significant amounts of time, [Customer Name] will concentrate on a smaller, representative subset of all possible variants, based on the primary Pastel Evolution solution design requirements and the more common processes with the highest business volumes.
Sub-process testing is conducted to validate that, for the corresponding business process within a larger process framework, both the configuration of the Pastel Evolution solution and the custom code development meet [Customer Name]'s business process requirements. Sub-process testing may be required to test individual processes within a larger process prior to executing the holistic process test.
Process Testing is the complete testing of related features and functions that make up a defined business process, performed during Design & Build by [Customer Name]. Process testing validates that both the configuration of the Pastel Evolution solution and the custom code development, for the corresponding business process, meet business process requirements. An example of a Process Testing scenario would be one that covers the Create a Case workflow, or one that validates the functionality of a specific Pastel Evolution solution entity. While it is imperative that Process Testing verify the functionality of all aspects of the solution being developed, it is not meant to be a system performance testing session. The Key Users and, if necessary, Subject Matter Experts will identify the Process Testing scenarios based on their To-Be process flows and make any changes after reviewing them. The scenarios should identify all functionality required to support the key business processes. The level of detail in the Process scenarios may vary as required.
The business process scenarios will require the Key User to validate data and expected results. All steps must be correct in order for the process script to pass. The test script will require the use of the corresponding data required for the testing. [Customer Name] will create, update, and validate data to ensure adequate support of test script requirements. The criteria established in this activity of the Design phase are the basis for comparing the results of tests executed in the Design & Build phase.
8.1.5 Data Acceptance Testing (DAT)
[Customer Name] Key Users will perform data analysis of the solution, in accordance with the data migration
requirements and data migration strategy. During DAT, the customer not only verifies the data migrated but also validates
that the data may be inquired upon, reported upon, and transacted upon.
8.1.6 Integration Testing
[Customer Name] Key Users will execute Interfaces and Integration Test Scripts for end-to-end business process testing. An example of this test would be a script that ties together the complete customer service contact management and request management processes.
This testing is conducted with the application security turned on, and is conducted in the TEST environment. Each aspect of feature security needs to be tested to ensure there are no issues, and to validate that the approved security design was implemented correctly. In addition to the stringent testing requirements for custom features, standard features or enhancements fully contained within the application need to be tested as well, to ensure that user access rights have been properly defined. These elements will be verified and validated in this activity.
The goal of this testing is to validate that all aspects of the Pastel Evolution solution, including all interacting/interfacing systems and subsystems (e.g. <<legacy system names>>), support [Customer Name]'s business processes and produce the expected results. Integration testing will also ensure that the introduction of additional interfaces or security won't have a negative effect on the previously validated system.
8.3 Error Severity
Information about the error incident is also maintained within the related Test Script documents. Errors are logged when they occur, and testing activities stop until the error has been categorized. Efforts to diagnose and resolve the issue will depend on the nature and severity of the problem:
- Severity 1: A critical business function is not functioning correctly or is not available. Manual processes or other alternatives are not possible. Continued testing of related downstream scripts is not possible, as downstream business functions will also be severely affected.
- Severity 2: A critical business function is not functioning correctly, or is severely impaired. Manual processes or other alternatives are possible, but may not be practical. Continued testing of related downstream scripts may be possible without extending the error to downstream business functions, and the failed test script may be resolved and regression tested independently of the other test scripts.
- Severity 3: A non-critical business function is not functioning correctly, or is severely impaired. Manual processes or other alternatives are possible. Continued testing of related downstream scripts may be possible without extending the error to downstream business functions. The failed test script may be resolved and regression tested independently of the other test scripts.
- Severity 4: All other test failures that have minimal impact on the general customer population or that affect individual users. The solutions to these test script execution failures will be proposed and addressed if time and schedules permit, or logged for inclusion in future releases.
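As a hypothetical illustration of how these categories apply: a failure that prevents sales orders from being invoiced, with no manual workaround, would be logged as Severity 1; the same failure with a workable manual posting procedure would be Severity 2; a defect in a non-critical lookup that can be worked around would be Severity 3; and a cosmetic misalignment on a printed report affecting only a handful of users would be Severity 4.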
9 Schedules
The following table and chart outline the anticipated schedule for testing activities as it pertains to the master project schedule, and provide insight into the durations for completing the activities. Please note that all activities run concurrently with at least one other testing activity, with the exception of Conduct User Acceptance Testing. Please see section 5 of this document for details about responsibilities for developing and executing the following deliverables.
Testing Activity | Start Date | Target Completion Date
[Gantt chart: testing activity schedule, November 2008 through March 2009; task durations range from 1 week to 11 weeks]