
HIGH LEVEL FUNCTIONAL TEST PLAN
TEST PLAN ACCELERATOR

PROJECT IDENTIFICATION
Project Name

CPI/Project Number

Customer Name

Customer Number

Project Type
(CBI, Implementation, CSS, Upgrade, Internal, other)

Planned Start/Finish

SAP Customer Partner

Project Sponsor

Program Manager

SAP Project Manager

Customer Project Manager

Partner Project Manager

Table of Contents
Table of Contents...............................................................................................................................2
1. Introduction..............................................................................................................................4
1.1. Purpose of Document...............................................................................................................4
1.2. Definition of Testing..................................................................................................................4
1.3. Objectives.................................................................................................................................4
1.4. Testing Scope...........................................................................................................................4
1.5. Realization Phase.....................................................................................................................5
1.6. Assumptions.............................................................................................................................7
2. Unit Testing Baseline Configuration....................................................7
2.1. Description................................................................................................................................7
2.2. Test Management.....................................................................................................................8
2.3. Test Documentation (If Automated Tools were not used).........................................................8
2.3.1 Test Documentation (If Automated Tools were used)..............................................................8
2.4. Test Data...................................................................................................................................8
2.5. Test System..............................................................................................................................8
2.6. Roles & Responsibilities...........................................................................................................9
2.7. Entrance Criteria.......................................................................................................................9
2.8. Exit Criteria...............................................................................................................................9
3. Business Process (String) Testing.........................................................10
3.1. Description..............................................................................................................................10
3.2. Test Management...................................................................................................................10
3.3. Test Documentation................................................................................................................10
3.3.1 Test Documentation (If Automated Tools were used)............................................................11
3.4. Test Data.................................................................................................................................11
3.5. Test System............................................................................................................................12
3.6. Roles & Responsibilities.........................................................................................................12
3.7. Entrance Criteria.....................................................................................................................12
3.8. Exit Criteria.............................................................................................................................13
4. Unit Testing Final Configuration..........................................................13
4.1. Description..............................................................................................................................13
4.2. Test Management...................................................................................................................13
4.3. Test Documentation................................................................................................................14
4.4. Test Data.................................................................................................................................14
4.5. Test System............................................................................................................................14
4.6. Roles & Responsibilities.........................................................................................................14
4.7. Entrance Criteria.....................................................................................................................14
4.8. Exit Criteria.............................................................................................................................15
5. Scenario (Integration) Testing.................................................................15
5.1. Description..............................................................................................................................15
5.2. Test Management...................................................................................................................16
5.3. Test Documentation................................................................................................................17

5.4. Test Data.................................................................................................................................17
5.5. Test System............................................................................................................................18
5.6. Roles & Responsibilities.........................................................................................................18
5.7. Entrance Criteria.....................................................................................................................19
5.8. Exit Criteria.............................................................................................................................19
5.9. Approval of Test Results.........................................................................................................20
6. User Acceptance Testing...........................................................................................20
6.1. Description..............................................................................................................................20
6.2. Test Management...................................................................................................................20
6.3. Test Documentation................................................................................................................20
6.4. Test Data.................................................................................................................................20
6.5. Test System............................................................................................................................21
6.6. Roles & Responsibilities.........................................................................................................21
6.7. Entrance Criteria.....................................................................................................................21
6.8. Exit Criteria.............................................................................................................................22
7. Technical System Testing..........................................................................................22
7.1. Description..............................................................................................................................22
7.2. Technical Functionality Tests..................................................................................................22
7.3. Performance Testing...............................................................................................................24
7.4. Test Management...................................................................................................................24
7.5. Test Documentation................................................................................................................25
7.6. Test Data.................................................................................................................................25
7.7. Test System............................................................................................................................25
7.8. Roles & Responsibilities.........................................................................................................25
7.9. Entrance Criteria.....................................................................................................................25
7.10. Exit Criteria...........................................................................................................................26
8. Defect Management.......................................................................................................26
9. Testing Roles & Responsibilities............................................................................30

1. Introduction

1.1. Purpose of Document

The purpose of this document is to describe the test plan for the AIM COMPANY project being conducted at AIM COMPANY. This document describes the types of testing and related activities that will occur during the Realization Phase of the project:

1. Unit Testing
2. Business Process (String) Testing
3. Scenario (Integration) Testing
4. User Acceptance Testing
5. Performance Testing

Note: This document should be reviewed and updated by project leadership based on scope, schedule, and cost
requirements of the project.

1.2. Definition of Testing

Testing is an activity aimed at evaluating an attribute or capability of a program or system to determine whether it meets the business requirements as defined in the Business Blueprint.

1.3. Objectives

The objectives of testing are:


Ensure that the system meets all the business requirements determined to be in scope
Ensure that the system meets technical requirements and meets service levels for application response time,
throughput, and infrastructure performance at typical production loads

1.4. Testing Scope

The scope of testing includes business and technical requirements defined in the approved Business Blueprint
document. The Business Blueprint requirements include the following elements: NOTE: UPDATE Testing Scope
based on final project scope.
Enterprise Organizational & Geographic Scope
Master Data Scope
SAP Solution Configuration
o Business Scenario
o Business Processes
o Process Steps
o Transactions
Manual and automated testing of SAP Solution - Configuration
Custom Development Objects (RICEFW):
o Reports
o Interfaces
o Conversions
o Enhancements
o Forms
o Workflow
Bolt-on Software
Technical Integration with Legacy Systems and Applications
Security Roles & Profiles by job/position
Performance Testing

1.5. Realization Phase

The AIM COMPANY PROJECT will conduct testing in multiple iterations that start with unit testing, continue with integration testing of business scenarios, data, and security, and end with user acceptance testing. Testing iterations should be completed sequentially, allowing for testing of new configuration and development as well as re-execution of tests completed during previous cycles.

Figure 1-1, Testing Work Packages for SAP ACTIVATE Methodology

Testing is designed to validate business requirements, providing traceability back to the Business Blueprint requirements used to build the SAP solution. The following illustration represents the dependencies between the types of testing that will be executed during the Realization Phase of this project.

Figure 1-2, Dependencies between Test Cycles

As part of testing, it is important to simulate daily, weekly, and monthly business events and activities (e.g. daily batch
processes, generation of key reports, and execution of financial monthly close) during business process (string) testing

and scenario (integration) testing in the quality assurance environment. Additionally, the project team will perform
mock builds to develop and practice cutover activities during business process (string) testing and scenario
(integration) testing iterations using the QA environment to prepare and simulate production build of the SAP Solution.

Figure 1-3, Testing Business Activities and Practicing Cutover Build in the QA Environment

1.6. Assumptions

The following assumptions were made during the development of this document.
Business Blueprint requirements acceptance and approval is completed before the Realization Phase begins
Unit Testing is complete before Integration Testing commences
Project Management Plans that govern changes to the system, design decisions, documentation, etc. are in place before Integration Testing commences. Specifically:
o Scope Management Plan
o Integration Management Plan, procedures related to:
o Integration Change Control Process
o Issue Management Process
o Risk Management Plan
Integration Testing is conducted in iterations with the progressive addition of security & data conversions

2. Unit Testing Baseline Configuration

2.1. Description

Unit Testing validates that individual functions are configured and/or developed to appropriately translate technical and
functional requirements. This would include testing of individual configuration transactions, development objects, and
manual business process steps associated with business transactions.

Testing of configurable transactions
o Test ERP transactions and related business process steps to achieve a defined business outcome within a module
o Test non-ERP transactions or steps performed as part of the configurable transaction
o Test manual transactions or steps performed as part of the configurable transaction
Testing of development objects. These tests will further include:
o Testing of the code within the development object. These tests will be based on the Technical
Specifications documents. This activity will be owned and conducted by the Technical Team.
o Testing the functional aspects of the development object. These tests will be based on the Functional
Specifications documents & Business Process Procedures and will be conducted by the Functional
Teams with the assistance of the Technical Team.

Unit testing is the lowest level of testing, where the business transaction or development object is tested and evaluated for errors. Unit testing is the first test completed during configuration and is focused on the program's inner functions rather than on integration. Both positive and negative testing should be performed for all critical functionality.

Positive Testing validates that the test functions correctly by inputting a known value that is correct and verifying that the data/view returned is what is expected
Negative Testing validates that the test fails by inputting a value that is known to be incorrect and verifying that the component or test case fails. This allows the team to understand and identify failures and confirms that the target application is operating correctly by displaying the appropriate warning message.
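As an illustration of these two approaches, the sketch below shows a positive and a negative unit test for a simple order-quantity check. The validation routine and its limits are hypothetical stand-ins for a configured business rule, not part of the actual SAP Solution.

```python
# Hypothetical stand-in for a configured business rule (name and limits invented).
def validate_order_quantity(quantity):
    """Accept only whole order quantities between 1 and 999."""
    if not isinstance(quantity, int) or not 1 <= quantity <= 999:
        raise ValueError("quantity must be a whole number between 1 and 999")
    return True

# Positive test: input a known-good value and verify the expected result is returned.
assert validate_order_quantity(10) is True

# Negative test: input a known-bad value and verify the appropriate warning is raised.
try:
    validate_order_quantity(-5)
    raise AssertionError("negative quantity was not rejected")
except ValueError as err:
    assert "between 1 and 999" in str(err)
```

Both cases must pass before the critical function can be considered unit tested.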

Unit testing begins during baseline configuration (test cycle/iteration 1), when the project team will work on configuring transactions and related business process steps that can be configured in the SAP Solution without customer development objects (e.g., programming or enhancements). Test cases developed during the Blueprint Phase should be used to unit test configurable transactions and related business process steps of an SAP module. As development objects are completed, unit testing will be performed to ensure functional specifications are met and that integration of custom development objects with configurable transactions and process steps operates successfully.

2.2. Test Management

Because baseline configuration unit testing concentrates on testing single business transactions, there is normally no need to develop detailed test documentation. For simple transactions, testing will be performed directly during configuration. However, some configuration transactions, development objects, and business process steps associated with business transactions are very complex, involving multiple screens, functions, and variations to run. These types of complex transactions (e.g., a sales order) will be documented and tested using a test case. Additionally, test data and test case information documented within the functional specifications of customer development objects will be used to review the code of technical objects and test the functional aspects of the development object.
Unit testing will be performed manually for configurable transactions and development objects using test cases.
Solution Manager and SAP Quality Center by HP will be used to provide a detailed listing of configurable transactions,
custom development objects, and business process steps that will be tested during unit testing for baseline
configuration. Unit test status will be recorded in Solution Manager and SAP Quality Center by HP and be reflected in
the project schedule. Unit test status reporting will be included in the weekly project status report.
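The weekly status reporting described above amounts to tallying test-case execution state. The sketch below is a minimal illustration, assuming a simple list-of-dictionaries inventory; the status names and record layout are assumptions, not the Solution Manager or SAP Quality Center data model.

```python
from collections import Counter

# Hypothetical unit-test inventory with execution status per test case.
unit_tests = [
    {"id": "UT-001", "object": "Sales order entry", "status": "passed"},
    {"id": "UT-002", "object": "Pricing condition", "status": "failed"},
    {"id": "UT-003", "object": "Custom report Z001", "status": "not run"},
    {"id": "UT-004", "object": "Delivery creation", "status": "passed"},
]

def weekly_status(tests):
    """Tally test-case status for the weekly project status report."""
    tally = Counter(t["status"] for t in tests)
    total = len(tests)
    executed = total - tally.get("not run", 0)
    return {
        "total": total,
        "executed": executed,
        "passed": tally.get("passed", 0),
        "failed": tally.get("failed", 0),
    }

print(weekly_status(unit_tests))
# {'total': 4, 'executed': 3, 'passed': 2, 'failed': 1}
```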

2.3. Test Documentation (If Automated Tools were not used)

A test case will be used to execute manual testing. The manual testing process using a test case:

Develop the test case using the test case template
o The test case contains the detailed, step-by-step instructions and criteria for completing a test (functional and performance) to support manual testing.
After the creation of the test case, the actual testing is performed manually by a project team member
The results of each test case are recorded manually
No test automation tools are used
Test management tools can be deployed for test administration and test organization
Test defects are tracked and monitored manually.

The AIM COMPANY functional lead and/or technical lead will review and approve all test cases.
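A minimal sketch of the information such a test case template typically captures follows; the field names here are assumptions for illustration, not the project's actual template.

```python
from dataclasses import dataclass, field

@dataclass
class TestStep:
    """One step of a manual test case: the action and its expected result."""
    action: str
    expected_result: str
    actual_result: str = ""
    passed: bool = False

@dataclass
class TestCase:
    """Manual test case with step-by-step instructions and pass/fail criteria."""
    case_id: str
    description: str
    steps: list = field(default_factory=list)

    def record(self, step_no, actual, passed):
        """Record the manually observed result of one step."""
        self.steps[step_no].actual_result = actual
        self.steps[step_no].passed = passed

    def result(self):
        """Overall result: the case passes only if every step passed."""
        return all(s.passed for s in self.steps)

tc = TestCase("TC-042", "Create standard sales order",
              [TestStep("Enter order header", "Header accepted"),
               TestStep("Enter line item", "Item priced and scheduled")])
tc.record(0, "Header accepted", True)
tc.record(1, "Item priced and scheduled", True)
print(tc.result())  # True
```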

2.3.1 Test Documentation (If Automated Tools were used)

A manual test case will be used to build & execute an automated test case. The automated testing process using a manual test case:

Develop the test script/case using the manual test case/BPP
After the creation of the test script/case, the actual testing is performed by a Test Engineer/Tester
The results of each test case are recorded & stored
Automation tools such as QTP, TAO, and LoadRunner are used to create the scripts
Test defects are tracked and monitored automatically in test management tools such as Quality Center (if applicable)

2.4. Test Data

Fabricated or customer specific master data will be manually entered as required for unit testing and used by other
teams where appropriate.

2.5. Test System

Unit testing will be performed in the project development environment (DEV) with manually entered test data to execute
test cases.

2.6. Roles & Responsibilities

The SAP consultant and AIM COMPANY business system analyst will conduct the unit testing of configuration and
execute test cases according to unit test plan. The developer along with the consultant and/or the business system
analyst will conduct the unit testing of development objects (if available).

Business system analyst will identify and input master data for unit testing

SAP Consultant will resolve configuration issues

Developer Team Members will resolve any issues with development objects

Security & Administration Team Member will resolve Security-related issues

Team Leads will ensure the creation, completeness, and quality of unit test cases and testing, manage test resources, and sign off on test results.

Project Manager & Integration Team Lead is responsible for the overall planning, tracking and reporting of
Unit Testing status and results.

2.7. Entrance Criteria

Business Requirements are finalized and frozen


Business Blueprint Requirements are documented and approved.
Business requirements document and custom development object listing is finalized and frozen
Development Test Environment is built, accessible, and ready for unit test
Project team security profiles and roles are in effect.
Configuration documentation requirements and transport procedure is defined and communicated to the
project team

Testing tools are installed and deployed.


The testing tools to be used for Unit testing have been installed, deployed, and are ready for use.
Project team members have been trained on the use of testing tools.
Unit test plan is loaded in Solution Manager and SAP Quality Center.
Detailed inventory of business transactions, business process steps, and custom development objects (if
available) to be unit tested
o Test data
o Test work packages
Completed unit test schedule

2.8. Exit Criteria

Unit test cases executed and defects resolved.


All unit test cases have been documented as required.
Unit test cases have been executed.
Unit test cases and test results have been reviewed and approved by functional leads and/or technical leads.
All significant defects (business-critical or high integration impact) have been resolved and retested.
Unit test results reported and approved
Weekly unit test results report - prepared, reviewed, and approved
Documentation updated for changes made during Unit Testing.
Any documentation updates as a result of changes made during the unit testing have been completed and
approved.

3. Business Process (String) Testing

3.1. Description

Business Process (String) Testing validates the full operability of interconnected functions, methods, or objects within the functional areas of an SAP Solution (e.g., Sales).

Includes a set of logically related activities or business process steps to achieve a defined business process.
Includes business processes that cross functional areas (e.g., Sales and Finance).

Business process (string) testing is not meant to be a full-blown integration test but rather a more comprehensive test within a module, or limited testing between some modules as needed (e.g., create an order, deliver it, bill it, and apply cash application). Test cases are updated as part of baseline configuration and used to automate or manually perform business process (string) testing of custom development objects, configurable transactions, and related business process steps for the SAP Solution that have been unit tested.
Subsequent integration testing will focus on more full-blown testing to ensure the proper functioning of cross-module business process flows.
Business process procedures (BPP) are developed in the QA environment during business process (string) testing and continue to be developed throughout scenario (integration) testing. BPPs will be utilized during end-user acceptance testing and used to assist with end-user training.

3.2. Test Management

The project teams will use manual and automated testing techniques to perform business process (string) testing. Test cases will continue to be used to perform manual tests; however, Solution Manager, SAP Quality Center by HP, and Testing Acceleration and Optimization will be used to automate test cases into test scripts.

Unit testing will continue to be performed manually for new and/or changed configuration and development objects
using test cases. Solution Manager and SAP Quality Center by HP will be used to provide a detailed listing of
configurable transactions, custom development objects, business process steps, and business processes that will be
tested during business process (string) testing. Test case and/or test script status will be recorded in Solution Manager
and SAP Quality Center and be reflected in the project schedule. Business process (string) testing status reporting will
be included in the weekly project status report.
BPP Tool - Productivity Pak (formerly InfoPak) will be used to create business process procedures (BPP) in the QA environment.

3.3. Test Documentation

A test case will be used to execute manual testing. The manual testing process using a test case:

Develop the test case using the test case template
o The test case contains the detailed, step-by-step instructions and criteria for completing a test (functional and performance) to support manual testing.
After the creation of the test case, the actual testing is performed manually by a project team member
The results of each test case are recorded manually
No test automation tools are used
Test management tools can be deployed for test administration and test organization
Test defects are tracked and monitored manually.

3.3.1 Test Documentation (If Automated Tools were used)

A manual test case will be used to build & execute an automated test case. The automated testing process using a manual test case:

Develop the test script/case using the manual test case/BPP
After the creation of the test script/case, the actual testing is performed by a Test Engineer/Tester
The results of each test case are recorded & stored
Automation tools such as QTP, TAO, and LoadRunner are used to create the scripts
Test defects are tracked and monitored automatically in test management tools such as Quality Center (if applicable)

The AIM COMPANY functional lead and/or technical lead will review and approve all test cases.
Test toolsets will be used to automate test scripts that can be reused for regression testing at a future date. In automated testing, after the creation of the test cases, a test script is created using test automation tools to perform testing activities. The automated testing process using a test script:

Approved test case is successfully executed


Test case is used to record automated test script
o A test script is an automated test case or automated test scenario.
After the creation of the test script, the actual testing is performed by automated test tools
After the test, the result of each test script is recorded automatically by the test tool
Test scripts can be reused for regression testing for support packages, upgrades, etc.
Methodical approach to reduce costs and safeguard the software lifecycle process
Test defects are tracked and monitored within the testing tool
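Conceptually, an automated test script replays the steps of an approved test case and records the result of each step without manual intervention, so the same script can be rerun unchanged for regression testing. The sketch below illustrates the idea in plain Python; it is not QTP, TAO, or LoadRunner syntax, and every name in it is invented.

```python
import json
import time

# Hypothetical replay of a recorded test script: each step is a callable
# standing in for a recorded UI action (in practice the test tool drives SAP).
def run_script(script_name, steps):
    """Execute recorded steps and log results automatically for regression reuse."""
    results = []
    for name, action in steps:
        try:
            action()
            results.append({"step": name, "status": "passed"})
        except Exception as err:
            results.append({"step": name, "status": "failed", "error": str(err)})
    return {"script": script_name,
            "run_at": time.strftime("%Y-%m-%d"),
            "results": results,
            "passed": all(r["status"] == "passed" for r in results)}

# Example: the same script can be rerun unchanged after a support package.
log = run_script("SalesOrder_Regression", [
    ("open order entry", lambda: None),
    ("save order", lambda: None),
])
print(json.dumps(log["passed"]))  # true
```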


3.4. Test Data

The majority of the test data will be created manually or using the extended Computer Aided Test Tool (eCATT) for simple and low-volume data conversion activities to load the test data needed to execute test cases and test scripts, which will be used by other teams where appropriate.
Manual Entry - Appropriate for low record counts where data entry accuracy can easily be managed by qualified project team members. The source of data must be documented to enable test case steps to complete data entry. This type of data migration is utilized by functional teams to perform unit testing, business process (string) testing, and integration testing cycles.
Extended Computer Aided Test Tool (eCATT) - An automated tool used to support simple and low-volume data conversion activities. Transaction scripts are created via the transaction recorder function and then played back with internal or external data in normal dialog operation. This type of data conversion is utilized by functional teams to perform business process (string) testing and integration testing cycles.
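For fabricated, low-volume test data, the record layout and values should be documented so the load can be repeated across test cycles. The sketch below shows one way to fabricate and document such data as CSV; the customer-master field names are hypothetical, and this is not an eCATT script.

```python
import csv
import io

# Hypothetical customer-master layout; field names are invented for illustration.
FIELDS = ["customer_id", "name", "city", "sales_org"]

def fabricate_customers(count, sales_org="1000"):
    """Fabricate a small, clearly marked set of customer master test records."""
    return [{"customer_id": f"TEST{i:04d}", "name": f"Test Customer {i}",
             "city": "Testville", "sales_org": sales_org}
            for i in range(1, count + 1)]

def to_csv(records):
    """Write records to CSV so the data source is documented and repeatable."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

data = fabricate_customers(3)
print(to_csv(data).splitlines()[1])  # TEST0001,Test Customer 1,Testville,1000
```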

3.5. Test System

Business process (string) testing is performed in the project quality assurance (QA) environment. NOTE: If a QA environment is not available, consider partitioning the DEV environment with a QA-like environment for test automation.

3.6. Roles & Responsibilities

The SAP consultant and AIM COMPANY business system analyst will conduct the unit testing of new configuration and
execute test cases and test scripts according to business process (string) test plan. The developer along with the
consultant and/or the business system analyst will conduct the unit testing of new development objects (if available)
and integrate previously unit tested development objects into business process (string) testing.

Business system analyst will identify and input master data for unit testing

SAP Consultant will resolve configuration issues and assist with eCATT automation of master data conversion
activities

Developer Team Members will resolve any issues with development objects

Security & Administration Team Member will resolve Security-related issues

Team Leads will ensure the creation, completeness, and quality of unit test cases, test scripts, and testing, manage test resources, and sign off on test results.

Project Manager & Integration Team Lead is responsible for the overall planning, tracking and reporting of
testing cycle status and results.

3.7. Entrance Criteria

Unit testing for baseline is completed


Documentation updates as a result of changes made during the unit testing have been completed and
approved
Quality assurance (QA) environment is built, accessible, and ready for unit test
Project team security profiles and roles are in effect.
Testing tools are installed and deployed.
The testing tools to be used for testing have been installed, deployed, and are ready for use.
Project team members have been trained on the use of testing tools.


Business process (string) test plan is loaded in Solution Manager and SAP Quality Center.
Detailed inventory of business transactions, business process steps, custom development objects (if available),
and business processes to be tested
o Test data
o Test work packages
Completed business process (string) test schedule
Dedicated test lab or area is established to perform business process (string) testing on standard AIM COMPANY
issued desktops and laptops
Utilize security profile and roles for business process (string) testing
Schedule daily testing status meetings with testers to review daily test plan, dependencies, successor tests,
and outcome.

3.8. Exit Criteria

Business process (string) testing test cases and test scripts have been executed and defects resolved.
All test cases have been documented and test scripts have been recorded as required.
All test cases and test scripts have been executed.
Test case, test script, and test results have been reviewed and approved by functional leads and/or technical
leads.
All significant defects (business-critical or high integration impact) have been resolved and retested.
Business process (string) test results reported and approved
Weekly test results report - prepared, reviewed, and approved
Documentation updated for changes made during business process (string) testing.
Any documentation updates as a result of changes made during the testing have been completed and
approved.

4. Unit Testing Final Configuration

4.1. Description

During final configuration the project team will continue to work on configuring transactions and related business
process steps that can be configured in the SAP Solution with customer development objects (e.g., programming or
enhancements). Test cases are updated as part of baseline configuration and business process (string) testing and should be used to unit test new and updated custom development objects, configurable transactions, and related process steps for the SAP Solution.
Final configuration tests business processes within the functional areas of the solution to verify that the configuration
and development is valid and that the configuration supports the business processes defined in the business blueprint.
At the end of final configuration, steps should be taken to freeze configuration and institute the change control process
for all configuration changes and transports within the SAP solution as a result of subsequent integration testing
defects.
Refer to the Unit Testing Baseline Configuration description section for further information regarding unit testing.

4.2. Test Management

Because baseline configuration unit testing concentrates on testing single business transactions, there is normally no
need to develop detailed test documentation. For simple transactions, testing is performed directly during
configuration. However, some configuration transactions, development objects, and business process steps
associated with business transactions are very complex, involving multiple screens, functions, and variations.
These types of complex transactions (e.g., a sales order) will be documented and tested using a test case. Additionally,
test data and test case information documented within the functional specifications of customer development objects will
be used to review the code of technical objects and test the functional aspects of the development objects.
Unit testing will be performed manually for new and/or updated configurable transactions and development objects
using test cases. Solution Manager and SAP Quality Center by HP will be used to provide a detailed listing of
configurable transactions, custom development objects, and business process steps that will be tested during unit
testing for final configuration. Unit test status will be recorded in Solution Manager and be reflected in the project
schedule. Unit test status reporting will be included in the weekly project status report.

4.3. Test Documentation

A test case will be used to execute manual testing. Manual testing process using a test case:

Develop test case using test case template


o Test case contains the detailed, step-by-step instructions and criteria for completing a test (functional and
performance) to support manual testing.
After the creation of the test case, the actual testing is performed manually by a project team member
The results of each test case are recorded manually
No test automation tools are being used
Test management tools can be deployed for test administration and test organization
Test defects are tracked and monitored manually.

The AIM COMPANY functional lead and/or technical lead will review and approve all test cases.
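The manual test-case structure described above can be sketched as a simple data record. This is an illustrative sketch only; the field names (`action`, `expected_result`, `status`) and the example steps are assumptions, not the project's actual test case template:

```python
from dataclasses import dataclass, field

@dataclass
class TestStep:
    """One step of a manual test case (illustrative field names)."""
    action: str              # what the tester does
    expected_result: str     # criterion for passing the step
    actual_result: str = ""
    status: str = "Not Run"  # Not Run / Passed / Failed

@dataclass
class TestCase:
    """Minimal manual test case record, loosely modeled on the template above."""
    case_id: str
    description: str
    steps: list = field(default_factory=list)

    def record(self, step_no, actual, passed):
        # The results of each step are recorded manually by the tester
        step = self.steps[step_no]
        step.actual_result = actual
        step.status = "Passed" if passed else "Failed"

    def overall_status(self):
        # A single failed step fails the case; all passed steps pass it
        if any(s.status == "Failed" for s in self.steps):
            return "Failed"
        if all(s.status == "Passed" for s in self.steps):
            return "Passed"
        return "In Progress"

# Hypothetical example case
tc = TestCase("UT-001", "Create sales order", [
    TestStep("Enter order header", "Header accepted"),
    TestStep("Add line item", "Item priced correctly"),
])
tc.record(0, "Header accepted", True)
tc.record(1, "Item priced correctly", True)
```

A record like this captures the detailed steps, the pass criteria, and the manually recorded results in one place, which is what the review and approval by the functional or technical lead acts on.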

4.4. Test Data

Fabricated or customer specific master data will be manually entered as required for unit testing and used by other
teams where appropriate.

4.5. Test System

Unit testing will be performed in the project development environment (DEV) with manually entered test data to
execute test cases.

4.6. Roles & Responsibilities

The SAP consultant and AIM COMPANY business system analyst will conduct the unit testing of configuration and
execute test cases according to unit test plan. The developer along with the consultant and/or the business system
analyst will conduct the unit testing of development objects (if available).

Business system analyst will identify and input master data for unit testing

SAP Consultant will resolve configuration issues

Developer Team Members will resolve any issues with development objects

Security & Administration Team Member will resolve Security-related issues

Team Leads will ensure the creation, completeness, and quality of unit test cases and testing, manage test
resources, and sign off on test results.

Project Manager & Integration Team Lead is responsible for the overall planning, tracking and reporting of Unit
Testing status and results.

4.7. Entrance Criteria

Business process (string) testing is completed


Documentation updates as a result of changes made during the testing have been completed and approved
Development (DEV) environment is accessible and ready for unit test
Project team security profiles and roles are in effect.
Configuration documentation requirements and transport procedure are in place and working
The testing tools to be used have been installed, deployed, and are ready for use.
Unit testing for final configuration test plan is loaded in Solution Manager and SAP Quality Center.
Detailed inventory of business transactions, business process steps, and custom development objects (if
available) to be tested
o Test data
o Test work packages
Completed unit testing test schedule
Dedicated test lab or area is established to perform testing on standard AIM COMPANY issued desktops and laptops
Utilize security profile and roles for testing
Schedule daily testing status meetings with testers to review daily test plan, dependencies, successor tests,
and outcome.
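An entrance-criteria review like the checklist above can be treated as a set of named gates that must all be satisfied before unit testing begins. A minimal sketch, with criterion names invented for illustration rather than taken from the actual checklist:

```python
# Hypothetical entrance-criteria gates; the names are illustrative
# stand-ins for the checklist items above, not the real project list.
ENTRANCE_CRITERIA = [
    "business_process_testing_complete",
    "documentation_updates_approved",
    "dev_environment_ready",
    "security_roles_active",
    "transport_procedure_working",
    "test_tools_deployed",
    "test_plan_loaded",
    "test_schedule_complete",
]

def check_entrance(status: dict) -> list:
    """Return the list of criteria that are not yet satisfied."""
    return [c for c in ENTRANCE_CRITERIA if not status.get(c, False)]

status = {c: True for c in ENTRANCE_CRITERIA}
status["security_roles_active"] = False
missing = check_entrance(status)
# Testing should not begin until `missing` is empty.
```

A report of the unmet gates gives the daily testing status meeting a concrete go/no-go input.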

4.8. Exit Criteria

Unit test cases executed and defects resolved.


All unit test cases have been documented as required.
Unit test cases have been executed.
Unit test cases and test results have been reviewed and approved by functional leads and/or technical leads.
All significant defects (business-critical or high integration impact) have been resolved and retested.
Unit test results have been reported and approved.
Weekly unit test results report has been prepared, reviewed, and approved.
Any documentation updates resulting from changes made during unit testing have been completed and approved.

5. Scenario (Integration) Testing

5.1. Description

Scenario (Integration) Testing validates a set of business processes that define a business scenario in a
comprehensive and self-contained manner on a macro level.

Integration testing is recommended to be done in multiple iterations.


The initial iteration of integration testing concentrates on testing all important business processes within the
SAP components of the implemented solution, starting with touch point scenarios and ending with end-to-end
scenarios.
The final iteration of integration testing focuses on cross-functional business scenarios with non-SAP systems
and applications, custom development objects, converted data, and solution security.


Scenario (Integration) testing will be designed at the business scenario level. Since business scenarios are collections
of business process and process steps, Integration Tests will include a collection of unit tests. Integration tests may
also include other transactions including manual transactions, custom transactions, security steps, etc. as defined in
the Business Blueprint. Business process procedures (BPP) are developed in the QA environment during business
process (string) testing and continue throughout scenario (integration) testing. BPPs will be utilized during end
user acceptance testing and used to assist with end user training.
Determine the number of scenario (integration) testing iterations required for your project and consider breaking out
iterations as separate sections, detailing specific requirements for each iteration.
Scenario (Integration) testing is recommended to be done in multiple iterations.

The first iteration of scenario (integration) testing concentrates on testing all important business processes
inside the SAP system, starting with business processes and ending with end-to-end-scenarios across
functional teams. Test cases and test scripts should be updated to incorporate business processes and end-to-end business scenarios, which include security profiles and roles, manual data entry and data conversion
testing.

The nth iteration of scenario (integration) testing will cover the full range of business scenarios with some
variation and include the business critical, custom-developed objects, and reports. Security roles & profiles will
be used in this iteration. However, security-related defects will not stop progress of testing a particular
business scenario. Data conversions will be in scope to the extent that conversion programs are available and
ready. However, only a portion of the full data will be converted.

Final iteration of scenario (integration) testing is an evolutionary process that will be driven from previous
testing efforts. The final iteration of integration testing focuses on the cross-functional integration points, end-to-end business processes, and critical cross-enterprise scenarios with touch points to external components
and legacy applications, including testing of all custom development objects, security profiles and roles,
regression testing of changes to existing production systems, and data conversions. The final iteration of
integration testing is accomplished through the execution of predefined support for business flows, or
scenarios, that emulate how the system will run your business. These business flows, using migrated data from
the preexisting systems, are performed in a multifaceted computing environment comprising SAP software,
third-party software, system interfaces, and various hardware and software components. Security roles and
profiles will be used for all testing activities. Testing of a particular business scenario will be stopped for any
defects (security or otherwise) encountered. All data conversions will be in scope and an effort will be made to
load all the data that is ultimately to be loaded into the Production environment.

During scenario (integration) testing the project team will perform mock builds to develop and practice cutover activities
during scenario (integration) testing iterations using the QA environment to prepare and simulate production build of the
SAP Solution. The QA environment will be refreshed between the iterations of scenario (integration) testing as
indicated in Figure 1-3, Testing Business Activities and Practicing Cutover Build in the QA Environment.
Security profiles and roles will be used during scenario (integration) testing iterations. This is required for the following
reasons.

Integration Testing seeks to test end-to-end business processes and attempts to simulate real-life business
events. Consequently, it is important to use security profiles to ensure that the hand-offs between departments
that are inherent in business processes occur seamlessly.

Integration Testing also uses production data that is converted into the test environment. It is important to
restrict access to sensitive data (i.e., salary information, personal information, bank information, etc.)

5.2. Test Management

The project teams will use manual and automated testing techniques to perform scenario (integration) testing. Test
cases will continue to be used to perform manual tests; however, Solution Manager, SAP Quality Center by HP, and
Testing Acceleration and Optimization will be used to automate test cases into test scripts.
Solution Manager and SAP Quality Center by HP will be used to provide a detailed listing of business scenarios that
include configurable transactions, custom development objects, business process steps, and business processes that
will be tested during integration testing. Test case and/or test script status will be recorded in Solution Manager and
SAP Quality Center by HP and be reflected in the project schedule. Scenario (integration) testing status reporting will
be included in the weekly project status report.

5.3. Test Documentation

A test case will be used to execute manual testing. Manual testing process using a test case:

Develop test case using test case template


o Test case contains the detailed, step-by-step instructions and criteria for completing a test (functional and
performance) to support manual testing.
After the creation of the test case, the actual testing is performed manually by a project team member
The results of each test case are recorded manually
No test automation tools are being used
Test management tools can be deployed for test administration and test organization
Test defects are tracked and monitored manually.

The AIM COMPANY functional lead and/or technical lead will review and approve all test cases.
Test toolsets will be used to automate test scripts that can be reused for regression testing at a future date. In
automatic tests, after the creation of the test cases, a test script is created using test automation tools to perform
testing activities. Automated testing process using a test script:

Approved test case is successfully executed


Test case is used to record automated test script
o A test script is an automated test case or automated test scenario.
After the creation of the test script, the actual testing is performed by automated test tools
After the test, the result of each test script is recorded automatically by the test tool
Test scripts can be reused for regression testing for support packages, upgrades, etc.
Methodical approach to reduce costs and safeguard the software lifecycle process
Test defects are tracked and monitored within the testing tool
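The automated flow above, replaying the recorded steps of an approved test case and logging the result of each step automatically, can be sketched in outline. This generic replay loop is an illustration only; it does not reflect the actual SAP Quality Center, eCATT, or Testing Acceleration and Optimization APIs, and `execute` is a hypothetical stand-in for the tool's playback engine:

```python
def replay_script(steps, execute):
    """Replay recorded steps; results are logged automatically.

    `steps` is a list of (action, expected) pairs recorded from an
    approved test case; `execute` is a callable that performs one
    action and returns the observed result (both hypothetical).
    """
    log = []
    for action, expected in steps:
        observed = execute(action)
        log.append({
            "action": action,
            "expected": expected,
            "observed": observed,
            "status": "Passed" if observed == expected else "Failed",
        })
    return log

# Reusable for regression testing: rerun the same recorded steps
# after a support package or upgrade and compare the logs.
recorded = [("open_order", "order_displayed"),
            ("post_goods_issue", "stock_reduced")]
results = replay_script(
    recorded,
    lambda action: {"open_order": "order_displayed",
                    "post_goods_issue": "stock_reduced"}[action],
)
```

Because the recorded steps and expected results are data rather than manual procedure, the same script can be replayed unchanged in later regression cycles, which is the reuse benefit the list above describes.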

5.4. Test Data

Test data will be derived from the following processes or tools:

Manual Entry - Appropriate for low record counts where data entry accuracy can be easily managed by
qualified project team members. The source of data must be documented to enable test case steps to complete
data entry. This type of data migration is utilized by functional teams to perform unit testing, business
process (string) testing, and integration testing cycles.

Extended Computer Aided Test Tool (eCATT) - an automated tool used to support simple and low-volume
data conversion activities. Transaction scripts are created via the transaction recorder function and then
played back with internal or external data in normal dialog operation. This type of data conversion is
utilized by functional teams to perform business process (string) testing and integration testing cycles.

Legacy System Migration Workbench (LSMW) - provides a recording function that allows the generation of a
"data migration object" in an entry or change transaction. Functional and technical specifications, mapping
documents, and test scripts are required. This type of data conversion is utilized by functional teams to
perform integration testing cycles/iterations.

Custom development objects - developed for data load programs (API) used for data transfer. Functional
and technical specifications, mapping documents, and test scripts are required. This type of data
conversion is utilized by functional teams to perform integration testing cycles/iterations.

5.5. Test System

The different environments that would be used through this testing cycle/iteration are described below:

QA Environment - As mentioned earlier, the scenario (integration) testing iterations will be conducted in a
controlled QA environment. To ensure that testing is valid, an Integrated Change Control Process will be used
to govern all changes to the system during scenario (integration) testing iterations. Typically, no changes to the
system will be implemented directly in the QA environment. Any changes required (to fix defects or incorporate
approved requirements) will be sourced in the DEV environment and migrated to the QA environment using
Solution Manager after successful unit testing. Also, no new functionality will typically be introduced into the QA
environment in the middle of a scenario (integration) test iteration. Any exceptions must be approved
by Project Management.

DEV Environment - Any changes required to the system during a scenario (integration) testing
iteration (to fix defects or implement approved requirements) will first be made in the DEV environment and unit tested
before they are migrated to the QA environment.

The basis/infrastructure team will be responsible for the creation and maintenance of these environments as well as
enforcing necessary controls on the QA environment.

5.6. Roles & Responsibilities

AIM COMPANY Project Team Members or AIM COMPANY Business System Analysts identify, develop,
and update initial test cases and test scripts to perform tests according to scenario (integration) test plan.

Subject Matter Experts perform all necessary scenario (integration) testing according to integration test plan
using business process procedures to execute test cases and/or test scripts with the assistance of the
business system analyst.

Business system analyst will identify and input master data for testing

SAP Consultant will resolve configuration issues

Developer Team Members will resolve any issues with development objects

Business Users or Subject Matter Experts (SMEs) confirm business process and UI functionality (if
applicable)

Legacy System SMEs support data conversion and migration activities, validate and confirm data loads,
and verify data accuracy.

Basis/infrastructure team will be responsible for the creation and maintenance of the project environments as
well as enforcing necessary controls on the QA environment.

Security & Administration Team Member will resolve Security-related issues


Team Leads will ensure the creation, completeness, and quality of test cases/scripts and testing, manage test
resources, and sign off on test results.

Project Manager & Integration Team Lead is responsible for the overall planning, tracking and reporting of
testing status and results.

5.7. Entrance Criteria

Unit testing for final configuration is completed


Documentation updates as a result of changes made during the testing have been completed and approved
Integrated change control process and procedure is established and in place
Configuration is frozen
Data conversion and migration loads are available and ready for testing
Quality assurance (QA) environment has been refreshed, is accessible, and is ready for scenario (integration) testing
Project team security profiles and roles are in effect.
The testing tools to be used have been installed, deployed, and are ready for use.
Scenario (Integration) Testing iteration test plan is loaded in Solution Manager
Detailed inventory of business transactions, business process steps, and custom development objects, and
business processes to be tested
o Test data
o Test work packages
Completed Scenario (Integration) Testing iteration test schedule
Dedicated test lab or area is established to perform testing on standard AIM COMPANY issued desktops and laptops
Utilize security profile and roles for testing
Schedule daily testing status meetings with testers to review daily test plan, dependencies, successor tests,
and outcome.

5.8. Exit Criteria

Scenario (Integration) Testing iteration test cases and/or test scripts have been executed and defects resolved.
All Scenario (Integration) Testing iteration test cases and/or test scripts have been documented as required.
Scenario (Integration) Testing iteration test cases and/or test scripts have been executed.
Scenario (Integration) Testing iteration test case and/or test script results have been reviewed and approved
by functional leads and/or technical leads.
All significant defects (business-critical or high integration impact) have been resolved and retested.
Data conversions successfully converted and completed.
Scenario (Integration) Testing iteration results reported and approved
Weekly Scenario (Integration) Testing iteration results report - prepared, reviewed, and approved
System meets the business application requirements as defined in the Business Blueprint Document

5.9. Approval of Test Results

Scenario (integration) testing test cases and/or test scripts may not be completely executed with a Passed status
within the timeframe available. Consequently, scenario (integration) testing would be considered complete under the
following circumstances:


All scenario (integration) testing test cases and/or test scripts have been executed at least once.
All defects that have been categorized as Show-stoppers (Severity 1 and 2) have been resolved and retested
successfully.
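The completion rule above reduces to two conditions: every test case or script executed at least once, and every show-stopper (Severity 1 and 2) defect resolved and retested. As a sketch, with the severity scale and field names assumed for illustration:

```python
def integration_testing_complete(scripts, defects):
    """Apply the two completion conditions stated above.

    `scripts` maps script name -> number of executions;
    `defects` is a list of dicts with a `severity` (1 = most severe)
    and `resolved`/`retested` flags. Field names are illustrative.
    """
    all_executed = all(runs >= 1 for runs in scripts.values())
    showstoppers_closed = all(
        d["resolved"] and d["retested"]
        for d in defects if d["severity"] in (1, 2)
    )
    return all_executed and showstoppers_closed

# Hypothetical iteration status
scripts = {"order_to_cash": 1, "procure_to_pay": 2}
defects = [
    {"severity": 1, "resolved": True, "retested": True},
    {"severity": 3, "resolved": False, "retested": False},  # lower severity may remain open
]
```

Note that under this rule a lower-severity open defect does not block completion; it is carried into the overall test results report instead.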

In addition to the documentation of test results for each scenario (integration) testing test case and/or test script, an
overall integration test results report will be produced at the end of each test iteration. This report will include the
following sections:

A list of all the test scripts used for this stage.


A list of all the defects reported during this stage with related status.
A report of the compliance with the Entrance Gates and Exit Gates agreed upon for the scenario (integration)
testing iteration including possible exceptions.
Any unplanned events / decisions that occurred during this testing iteration that would constitute a deviation
from the approved testing strategy and approach or this document.

6. User Acceptance Testing

6.1. Description

User Acceptance Testing (UAT) - users test the complete, end-to-end business processes to verify that the
implemented solution performs the intended functions and satisfies the business requirements.
UAT is the last test cycle of an SAP solution implementation and is an essential part of gaining end-user acceptance of the
system. This cycle occurs at the end of the Realization phase of the implementation, subsequent to the integration testing
cycles, to ensure that the system has been tested thoroughly by the project team and is ready to be released to the end-user
community. The emphasis of UAT is to demonstrate that the system functions as designed.

6.2. Test Management

The project team members will assist end users in performing manual and automated testing techniques for UAT. Test
cases will continue to be used to perform manual tests; however, Solution Manager, SAP Quality Center by HP, and Testing
Acceleration and Optimization will be used to automate test cases into test scripts.
Solution Manager and SAP Quality Center by HP will be used to provide a detailed listing of business scenarios that
include configurable transactions, custom development objects, business process steps, business processes, and
business scenarios that will be tested during UAT. Test case and/or test script status will be recorded in Solution
Manager and SAP Quality Center by HP and be reflected in the project schedule. UAT status reporting will be included
in the weekly project status report.

6.3. Test Documentation

Manual test cases and automated test scripts executed throughout scenario (integration) testing will be utilized for UAT.

6.4. Test Data

Production data will be used for UAT.

6.5. Test System

UAT will be performed in the QA environment.

6.6. Roles & Responsibilities


End users and/or subject matter experts perform all necessary UAT according to the plan using business
process procedures to execute test cases and/or test scripts with the assistance of the business system
analyst.


Business system analyst will identify and load master data

SAP Consultant will resolve configuration issues

Developer Team Members will resolve any issues with development objects

Legacy System SMEs support data conversion and migration activities, validate and confirm data loads,
and verify data accuracy.

Basis/infrastructure team will be responsible for the creation and maintenance of the project environments as
well as enforcing necessary controls on the QA environment.

Security & Administration Team Member will resolve Security-related issues

Team Leads will ensure the creation, completeness, and quality of test cases/scripts and testing, manage test
resources, and sign off on test results.

Project Manager & Integration Team Lead is responsible for the overall planning, tracking and reporting of
testing status and results.

6.7. Entrance Criteria

Scenario (integration) testing is completed


Documentation updates as a result of changes made during the testing have been completed and approved
Configuration is frozen
o All configuration issues have been closed, documented, and approved.
o Business Process Procedures completed and approved.
o Configuration Documentation completed and approved.
Integrated change control process and procedure is established and in place
Data conversion and migration loads are successfully loaded
Quality assurance (QA) environment has been refreshed, is accessible, and is ready for UAT
Project team security profiles and roles are in effect.
The testing tools to be used have been installed, deployed, and are ready for use.
UAT test plan is loaded in Solution Manager
Detailed inventory of business transactions, business process steps, and custom development objects,
business processes, and business scenarios to be tested
o Test data
o Test work packages
Completed UAT test schedule
Dedicated test lab or area is established to perform testing on standard AIM COMPANY issued desktops and laptops
Utilize security profile and roles for UAT
Schedule daily testing status meetings with testers to review daily test plan, dependencies, successor tests,
and outcome.

6.8. Exit Criteria

UAT test cases and/or test scripts have been executed and defects resolved.


All UAT test cases and/or test scripts have been documented as required.
UAT test cases and/or test scripts have been executed.
UAT test case and/or test script results have been reviewed and approved by functional leads and/or
technical leads.
All significant defects (business-critical or high integration impact) have been resolved and retested.

UAT results reported and approved


Weekly UAT results report - prepared, reviewed, and approved
System meets the business application requirements as defined in the Business Blueprint Document

7. Technical System Testing

7.1. Description

Technical system testing consists of technical functionality and performance tests and is basis/infrastructure-oriented.
Technical functionality tests aim to validate that the technical components of the production environment are working
properly. They include validating the following:

System administration procedures


Failure recovery
Disaster recovery

Performance testing determines the performance of the application by using an automated tool to simulate a
representative user load while measuring system resources and response times. This includes baselining
server and client response times.
Volume Testing - identifies the maximum load a given hardware configuration can handle using representative
peak loads.
Stress Testing - scenarios that simulate peak loads, including concurrently connected and synchronizing remote users
and concurrent execution of other production jobs (interfaces, etc.).
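At its core, a stress or volume test drives many concurrent simulated users against the system and collects response times. A minimal, tool-agnostic sketch (this is not SAP LoadRunner; the simulated transaction is a placeholder for real business transactions):

```python
import threading
import time
import random
import statistics

def simulated_transaction():
    """Stand-in for one business transaction; a real test would call the system."""
    time.sleep(random.uniform(0.001, 0.005))

def run_stress_test(concurrent_users=20, transactions_per_user=5):
    """Drive concurrent simulated users and collect response times."""
    times = []
    lock = threading.Lock()

    def user():
        for _ in range(transactions_per_user):
            start = time.perf_counter()
            simulated_transaction()
            elapsed = time.perf_counter() - start
            with lock:
                times.append(elapsed)

    threads = [threading.Thread(target=user) for _ in range(concurrent_users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return {
        "samples": len(times),
        "mean_s": statistics.mean(times),
        "max_s": max(times),
    }

report = run_stress_test()
```

Scaling `concurrent_users` upward approximates a volume test (finding the maximum sustainable load), while holding it at the expected peak with concurrent background jobs approximates a stress test.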
The technical team, in partnership with the Project Leadership, will define the criteria for the successful completion of
each of the following tests:

7.2. Technical Functionality Tests

Batch Cycle Test & Month/Quarter/Year-End Processing - Daily, monthly, quarterly, and yearly batch cycles will
be tested, including batch testing of interfaces and performance. The following will also be included in batch
testing:

Extracts in
Extracts out

Failure Test - The purpose of this task is to test and re-test failure (error) situations on the following:

Production system hardware


Operating system
Database
SAP system software

Training, trouble-shooting, and error-resolution procedures will also be developed by the technical team.


Disaster Recovery Test - The purpose of this task is to test and re-test the disaster recovery plan and procedures
defined for the production environment, as follows:
Tests third-party provider services and responsiveness
Tests that the entire technical infrastructure can be reproduced. This will include tests of the network, system
hardware, performance, printing, and user configuration.
Tests to verify that an acceptable disaster downtime window can be achieved. Corrective actions on any
unacceptable results will be taken (for example, procedures will be re-defined, downtime window expectations
will be reset, and other third-party providers will be investigated).
Changes to the Systems Operations Manual will be incorporated.
Backup and Restore Test - The purpose of this task is to test and re-test the backup and restore procedures defined
for the production environment until acceptable results are achieved. Multiple tests of data corruption scenarios will be
included (for example, disk failure), as well as user errors (for example, unintentional data deletion).
System Administration Test - The purpose of this task is to test and re-test the system administration procedures
defined for the production environment until acceptable results are achieved. As defined in the SAP Operations
Manual, the following activities of a system administrator will be tested:
Managing job scheduling
Administering corrections and transports
Reacting to SAP system alerts and logs
Printing and Fax Test - The purpose of this task is to test and re-test the implemented printing and fax functions, as
well as the related administration procedures, defined for the production environment until acceptable results are
achieved.
Going Live Check - The purpose of this task is to check system performance before going live using the SAP GoingLive Check service. The service measures the following:

Server
Database
Applications
Configurations
System load

The results and recommendations are recorded in a status report.

7.3. Performance Testing

Volume Test - The purpose of this task is to confirm that the system can carry the full volume of crucial SAP business
processes, enabling the identification of potential improvements to system performance before going live. Volume tests
will determine the following:

Transaction volume
Data volume
Required levels of performance for each critical SAP business process

As each test procedure is set up in this activity, the necessary volume of transactions and functions is generated. Once
the test has been run, the results will be reviewed and any changes to fine-tune system performance will be made.


Stress Test - The purpose of this task is to confirm that the configured production environment is viable for production
operation of all business processes, enabling the identification of potential improvements to system performance
before going live. The final test plan defines the following:

Transactions processed
Data volumes
Required levels of performance

The test will be carried out with both actual users and simulation scripts using TDMS with live data. Once the test has
been run, the results will be reviewed and any changes will be made to fine-tune system performance. Stress testing
will be performed in the QA environment and on the production server, if necessary. Stress test scenarios will be
designed to test specific cases that challenge the system (i.e., worst case scenarios).

7.4. Test Management

There is no special tool provided by SAP to organize technical testing. SAP LoadRunner by HP and SAP Diagnostics
will be utilized for performance testing to assist in mitigating performance risks, accelerating application delivery, and
optimizing business agility. The technical team will use SAP LoadRunner to validate and test the performance of
the software that supports the business requirements. This is usually a five-step process:
1) Define the business processes - The project team uses SAP LoadRunner to determine the optimum hardware
and software platform, based on the operations of the business process itself.
2) Develop a test script - The project team develops one or more test scripts to walk the software through the
actual screens and entries that will be used by the business process. This script represents the activities of a
virtual end user.
3) Expand the user load - SAP LoadRunner now lets the project team increase the virtual user load in order to
simulate the peak loading requirements for this business process.
4) Evaluate results - SAP LoadRunner measures simulated performance against the key performance indicators
for the process and then recommends any necessary changes to the related enterprise software systems.
5) Retest to validate - The team then uses SAP LoadRunner to rerun the tests, using the modified environment, to
validate the effectiveness of the changes.
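As an illustration of steps 2 through 5, the sketch below simulates virtual users executing a scripted transaction and evaluates the results against a response-time KPI. This is a minimal, generic Python sketch, not the SAP LoadRunner API; `run_transaction`, the sleep-based timing, and the KPI threshold are all hypothetical stand-ins.

```python
import time
import random
from concurrent.futures import ThreadPoolExecutor

def run_transaction() -> float:
    """Hypothetical stand-in for one virtual user executing a scripted
    business transaction; a real run would drive the SAP GUI or HTTP layer."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.01, 0.05))  # simulated server processing
    return time.perf_counter() - start

def load_test(virtual_users: int, kpi_seconds: float) -> dict:
    """Run one transaction per virtual user and compare against the KPI."""
    with ThreadPoolExecutor(max_workers=virtual_users) as pool:
        times = list(pool.map(lambda _: run_transaction(), range(virtual_users)))
    return {
        "users": virtual_users,
        "avg_response": sum(times) / len(times),
        "worst_response": max(times),
        "kpi_met": max(times) <= kpi_seconds,
    }

# Step 3-4: expand the user load, then evaluate results against the KPI.
for users in (10, 50):
    result = load_test(users, kpi_seconds=1.0)
    print(users, result["kpi_met"])
```

Retesting after tuning (step 5) amounts to rerunning `load_test` with the same user counts and confirming the KPI is now met.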

7.5. Test Documentation

Manual test cases and automated test scripts executed during scenario (integration) testing will be reused for
performance testing.

7.6. Test Data

Production data will be used for performance testing.

7.7. Test System

Technical System Testing will be prepared in the QA system, but must be executed on the Production system.

7.8. Roles & Responsibilities


The Integration Lead and Technical Lead supervise the overall technical system and development tests.
The Development Team Lead provides direction in the planning and execution of the technical and performance
testing.
The Basis Technical Lead is responsible for the planning and execution of all technical and performance testing.
Project Team Members execute testing under the direction and supervision of the Basis Technical Lead, the
Development Team Lead, the Integration Lead, and the Technical Lead.

Team Leads ensure the creation, completeness, and quality of test cases/scripts and testing, manage test
resources, and sign off on test results.
The Project Manager & Integration Team Lead are responsible for the overall planning, tracking, and reporting of
testing status and results.

7.9. Entrance Criteria

Unit testing for final configuration is completed


Documentation updates as a result of changes made during the testing have been completed and approved
Integrated change control process and procedure is established and in place
Configuration is frozen
Data conversion and migration loads are available and ready for testing
Quality assurance (QA) environment has been refreshed, accessible, and ready for technical testing
Project team security profiles and roles are in place.
The testing tools to be used have been installed, deployed, and are ready for use.
Technical team members have been trained on the use of testing tools.
Technical Functionality Testing and Performance Testing plan is loaded in <<Solution Manager, SAP Quality Center,
Sharepoint, etc>>
Detailed inventory of business transactions, business process steps, custom development objects, business
processes, and business scenarios to be tested for performance testing, including:
o Test data
o Test work packages
Completed Technical Functionality Testing and Performance Testing schedules
Dedicated test lab or area is established to perform testing on standard AIM COMPANY issued desktops and laptops
Security profiles and roles for testing are utilized.
Daily testing status meetings are scheduled with testers to review the daily test plan, dependencies, successor
tests, and outcomes.

7.10. Exit Criteria

The technical team leads and Project Leadership approve all test results
All Technical and Test Plans are updated
All issues related to Technical System testing are resolved
The project team can verify that the system is stable and ready for cutover.

8. Defect Management

A defect is a test problem or error that must be corrected. Defects will be written for SAP Solutions, legacy
applications, and manual steps that are part of the testing scope of the program or project. A defect is formally
documented in a program or tool for managing test defects. Defects can become project issues if the solution does not
meet the business or technical requirements of the customer.
The following defect management tool will be used for tracking and monitoring testing defects during the project
implementation:
Defect Management Tools

Tool: SAP Quality Center by HP
Capabilities: The defect management component supports the entire defect life cycle, from detecting the problem, to
assigning a resource to the defect, to fixing the defect and verifying the fix. Before a new defect is reported, SAP
Quality Center checks the database for similar defects, eliminating the need for manual checking.
Comments: Integrated with automated test tools.

Tool: SAP Test Workbench (Test Organizer)
Capabilities: Integrated into SAP Solution Manager for test organization and execution, along with problem message
handling, administration, and tracking of errors.
Comments: Not integrated with automated testing tools outside of Solution Manager.

Tool: Other (Sharepoint Software, Livelink Software, etc.)
Capabilities: Customized repository to manually track and record defects.
Comments: No integration with automated testing tools.

The following represents the general testing process during a testing cycle when a defect is found:

Figure 3-1, Illustration of General Testing Process

The tester assigned to a test case or test script is responsible for entering the defect into the Defect Repository by
specifying the following:
Defect ID
Status
Severity is the prioritization of the defect:
o Severity 1 (1-Critical): Serious errors that prevent or stop testing of a particular function, or serious
data type errors (e.g., the system locks up).
o Severity 2 (2-High): Serious or incorrect functionality errors, incorrect data, or significant load
problems that may make the application unusable (e.g., login takes over 5 minutes; a query takes 10
minutes).
o Severity 3 (3-Medium): Defects that do not prevent or hinder the functionality or load of the system
(e.g., an incomplete phone number string is returned).
o Severity 4 (4-Low): Defects that do not prevent or hinder functionality of the system, normally
confined to the user interface (e.g., a misspelled word).
Note: When a tester identifies a potential Severity One (1) or Two (2) defect, the tester will contact the test lead
immediately to implement the appropriate corrective action.
Summary (brief description of defect)
Project (Cycle of Testing)
Project Area (i.e., Functional/Technical Team).
Description (detailed description of the defect)
R&D Comments (update the history associated with a defect)
Attachments (i.e., pictures)
Detected On Date


Close Date
Defect Type: use the appropriate value to define the type of defect.
o Code: Defect is within the code.
o Data: Bad data or data errors.
o Enhancement: Used with the Change Control Procedure process to request a change.
o Other: Use only after receiving approval from the Test Lead.
o Performance: Based on a performance problem.
o Specification: Requirements are not correct.
o UI/Cosmetic: UI error (e.g., a misspelled label).
o Usability: Unable to perform a task.
Defect Status: use the appropriate value to define the tracking status of a defect.
o Closed: The identified test case with a defect passes all applicable regression tests.
o Duplicate-Closed: The Test Lead sets this status if a test step/process is determined to be no longer
valid.
o Fixed: The developer has corrected and unit tested the defect, and it is ready for retest.
o Deferred: Further investigation is required to determine whether this is a true defect.
o Fixed Pending Build: The defect is fixed and the testing SRF is pending the update.
o New: The defect was entered into the system but has not been validated by the Test Lead for assignment.
o Open: The defect is not assigned to anyone to fix.
o Re-open: The defect is found again after it was closed.
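The severity levels and status values above can be captured in a simple defect record. The Python sketch below is an illustrative model only; the `Defect` class, its fields, and the `needs_escalation` helper are hypothetical and do not correspond to any particular defect management tool's schema.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional

class Severity(Enum):
    CRITICAL = 1  # stops testing of a function (e.g., system lock-up)
    HIGH = 2      # serious functionality, data, or load errors
    MEDIUM = 3    # does not hinder functionality or load of the system
    LOW = 4       # normally confined to the user interface

class Status(Enum):
    NEW = "New"
    OPEN = "Open"
    FIXED = "Fixed"
    FIXED_PENDING_BUILD = "Fixed Pending Build"
    DEFERRED = "Deferred"
    CLOSED = "Closed"
    DUPLICATE_CLOSED = "Duplicate-Closed"
    REOPEN = "Re-open"

@dataclass
class Defect:
    defect_id: str
    summary: str
    severity: Severity
    status: Status = Status.NEW          # new defects start unvalidated
    detected_on: Optional[str] = None
    comments: list = field(default_factory=list)

    def needs_escalation(self) -> bool:
        # Severity 1 and 2 defects are escalated to the test lead immediately.
        return self.severity in (Severity.CRITICAL, Severity.HIGH)

d = Defect("D-001", "Login takes over 5 minutes", Severity.HIGH)
print(d.needs_escalation())  # → True
```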


The escalation process to remedy Severity 1 and 2 test defects is shown below:

Figure 3-2, Example of Defect Escalation Process for Severity 1&2 Defects

The next step is to pass the defect through the Developer Fix Process, where the developer or consultant who
executed the configuration works to resolve the defect:

Figure 3-3, Illustration of Developer Fix Process.

The development/configuration team is responsible for updating the defect information in the Defect Repository by
specifying the following:
Assigned To (developer's name)
Status
Assigned Date
Closing Date
R&D Comments (what action was taken to correct the defect)
The Integration Lead or Test Lead is responsible for periodically checking the Defect Repository to validate when a
defect is fixed and ready for retest, or to escalate when a Severity 1 or 2 defect is impacting the test schedule. Once a
defect is successfully retested (and relevant regression testing has taken place), the Tester will update the status in the
Defect Repository.
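The defect life cycle described above (entry, assignment, fix, retest, closure or re-open) implies a set of legal status transitions. The sketch below shows one hypothetical way to validate them; the transition table is an assumption drawn from the status definitions in Section 8, not a rule mandated by any tool.

```python
# Hypothetical allowed transitions for the defect life cycle described above.
ALLOWED_TRANSITIONS = {
    "New": {"Open", "Duplicate-Closed", "Deferred"},
    "Open": {"Fixed", "Fixed Pending Build", "Deferred", "Duplicate-Closed"},
    "Fixed Pending Build": {"Fixed"},
    "Fixed": {"Closed", "Re-open"},       # closed after retest, re-opened on failure
    "Deferred": {"Open", "Duplicate-Closed"},
    "Closed": {"Re-open"},                # a closed defect found again is re-opened
    "Re-open": {"Fixed", "Fixed Pending Build"},
}

def can_transition(current: str, new: str) -> bool:
    """Return True if a defect may move from `current` status to `new`."""
    return new in ALLOWED_TRANSITIONS.get(current, set())

print(can_transition("Fixed", "Closed"))  # → True
print(can_transition("New", "Closed"))    # → False
```

A defect repository or workflow tool could apply a check like this before accepting a status update, so that, for example, a defect cannot be closed without first being fixed and retested.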


9. Testing Roles & Responsibilities

An Integration Lead or Test Lead (TL) is responsible for managing the test team. A TL has thorough knowledge of
functional testing using structured testing techniques, including automated testing techniques and their application
across differing projects. Additional responsibilities and characteristics include the following.
Develops test strategies, assessments, and test plans
Ensures that projects are successful and on schedule
Works daily with customers and acts as the Point of Contact (PoC) for escalations
Provides direction and support for the test team
Manages testing, including tracking and reporting of testing progress, defects, risks, and issues
Provides team leadership
Reports test results, conducts status reviews, and leads the defect management process
Identifies and escalates issues and risks
The Test Architect (TA) provides technical guidance and has thorough knowledge of testing using structured testing
techniques, including automated testing techniques. Responsibilities and characteristics include the following.
Provides team leadership
Communicates with the business to obtain more detail on the individual business processes
Ensures that the exact business process is defined at the step level
Assists in identifying system requirements
Installs and configures automation software
Creates test scripts and manages test script automation (virtual project teams and/or line responsibility)
Acts as point of contact for upper management and project management when there is no Test Manager
Represents the test team in periodic review meetings (defect review, build, etc.)
Participates in the definition and development of test plans and test cases, and participates in manual functional testing
Reviews team results, defects, and test cases to ensure adherence to established testing standards
Project Team members are responsible for test automation and execution.
Create test scripts using a third-party tool (e.g., HP, Compuware)
Define test execution scenarios
Develop, debug, and run automated test scripts using an automation tool such as QTP
Execute manual and automated tests; report issues and defects
Establish, conduct, and control testing scenarios and predictive outcomes
Perform maintenance on automated test scripts
Report project status to the Test Lead
Track metrics on defects, test results, etc.
Comply with Change Management requirements
Record test results
Provide direction and requirements during test assessments and scoping
