DD-MM-YY/ Version
Revision History
Version   Date   Author(s)   Reviewer(s)   Change Description
Copyright Information
This document is the exclusive property of XXX Corporation (“XXX”); the recipient agrees that he/she may not copy,
transmit, use, or disclose the confidential and proprietary information set forth herein by any means without the
express written consent of XXX. By accepting a copy hereof, the recipient agrees to adhere to and be bound by
these conditions of confidentiality regarding XXX's practices and procedures, and to use these documents solely for
responding to XXX’s operations methodology. All rights reserved, XXX Corporation, 2000. XXX IT reserves the right to
revisit Business Requirements and Functional Specifications Documents if approval to proceed is not received within
90 days of the issue date.
Test Plan XXX
1 INTRODUCTION...........................................................................................................................................7
4.1 Scope..............................................................................................................................................................................8
4.5 Responsibilities...........................................................................................................................................................11
11.1 Purpose......................................................................................................................................................................20
11.2 Responsibility............................................................................................................................................................20
11.3 Environment.............................................................................................................................................................20
12.1 Purpose......................................................................................................................................................................21
12.2 Responsibility...........................................................................................................................................................21
12.3 Environment.............................................................................................................................................................21
13.1 Purpose......................................................................................................................................................................24
13.2 Responsibility...........................................................................................................................................................24
13.3 Environment.............................................................................................................................................................24
14.1 Purpose......................................................................................................................................................................28
14.2 Responsibility...........................................................................................................................................................28
14.3 Environment.............................................................................................................................................................28
15.1 Purpose......................................................................................................................................................................29
15.2 Scope..........................................................................................................................................................................29
15.3 Responsibility...........................................................................................................................................................29
15.4 Environment.............................................................................................................................................................29
16.1 Purpose......................................................................................................................................................................35
16.2 Responsibility...........................................................................................................................................................35
16.3 Environment.............................................................................................................................................................35
17.1 Purpose......................................................................................................................................................................36
17.2 Responsibility...........................................................................................................................................................36
17.3 Environment.............................................................................................................................................................36
1 Introduction
XXX is in the process of re-engineering and re-designing the XXX.com website. XXX has been working with
XYZ Private Limited to complete the architectural and detailed design of the new XXX.com site. At this
time the development phase of the project is underway and the site launch is planned for March 2005.
Quality Assurance Testing is the joint responsibility of the XYZ team and the Business team from Budget. The
purpose of this document is to provide an overview of the testing process for the XXX.com project. This
document will be distributed to all Project Managers for review. It is the responsibility of the Project
Managers to distribute this document to the appropriate team members for review where necessary.
Please refer to the following document in SharePoint for the team structure of the eCommerce XXX.com Redesign
project:
The Integration and System Testing Plans and Scripts are based upon the information provided in the signed-off
sections of the Booking Engine Use Cases EBR - Budget.com, Non-Booking Engine Use Cases XXX.com Redesign EBR,
Page Specifications for Ecommerce XXX.com Redesign, Non-Functional Requirements XXX.com Redesign EBR, the
External Interfaces and Data Feeds document, the Technical Description – Query String Parameters, and the Splash
Pages list. Any changes to these documents will follow the appropriate channels of the Change Control Board. Once
changes are approved, the test plans and test scripts will be modified accordingly.
The detailed testing project plan is part of the master project plan.
As test plans and test scripts are completed and assigned a version number, they will be placed in the SharePoint
portal under the Test Documents folder. As plans and scripts are completed or modified, notification will be sent
to the appropriate individuals. Scripts created for the purpose of testing will be placed in the above-referenced
Test Documents folder under Integration, QA, and UAT.
4 Testing Strategy
4.1 Scope
The following levels of testing are in scope:
• Unit
• Integration
• System
• Regression
• Load
• User Acceptance
Participation from all development areas will be required. Each Development Project Plan should account
for development participation in each phase of testing. It is anticipated that the level of developer
involvement will decrease as the testing progresses.
• Manual test script generation will be the preferred method until such time that the Testing
Project Manager determines that site stability is adequate for automated script creation.
• A-Pass: focuses on “normal” conditions to ensure all parts of the application are working under normal
test scripts. Immediate identification of major issues is required.
• B-Pass: focuses on “exception” conditions to ensure boundary conditions, error handling, etc.
are working correctly. Immediate identification of major issues is required.
• Normal (N): Test scripts that test the expected behavior under normal, or “pass” conditions.
• Exception (E): Test scripts that test the expected behavior under exception, or “fail”
conditions.
• Data Normal (DN): Test scripts that test the expected behavior under data-specific normal
conditions.
• Data Exception (DE): Test scripts that test the expected behavior under data-specific
exception conditions.
• Iteration: One (1) complete end-to-end A-Pass and one (1) complete end-to-end B-Pass across all
modules.
• Unit test script creation and execution is the responsibility of the development staff(s)
• Number of iterations for Integration Testing will be on an as needed basis within the Integration
Testing Cycle. This will be determined by the Testing Project Manager and Customer Application
Project Manager(s) during the Integration Testing Cycle.
• Integration Testing will include all (N) test scripts during the A-Pass and (E) test scripts during the
B-Pass.
• Number of iterations for System Testing will consist of up to three. Should issues arise that justify
additional iterations, the testing timeline will increase by five (5) days per iteration.
• System Testing will include all (N) & (DN) test scripts during the A-Pass and (E) & (DE) test scripts
during the B-Pass
• User Acceptance Testing: Test scripts to be created by QA team with the help of Business Analyst
and User Acceptance Group (Business).
• Load Testing will consist of a select group of (N) scripts that accurately represent a cross section
of functionality against a predetermined load.
• Regression Testing will be created from the (N), (DN), (E), & (DE) test scripts.
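The pass/category pairings above (N and DN scripts in the A-Pass, E and DE scripts in the B-Pass) amount to a simple selection rule. A minimal sketch, with hypothetical script records and function names not taken from any actual tool:

```python
# Illustrative sketch: selecting test scripts for a pass based on the
# category codes defined in this plan (N, E, DN, DE). Function and
# data-structure names are hypothetical.

# Which categories run in each pass, per testing phase (from this plan):
# - Integration: N scripts in the A-Pass, E scripts in the B-Pass
# - System: N & DN in the A-Pass, E & DE in the B-Pass
PASS_CATEGORIES = {
    ("integration", "A"): {"N"},
    ("integration", "B"): {"E"},
    ("system", "A"): {"N", "DN"},
    ("system", "B"): {"E", "DE"},
}

def select_scripts(scripts, phase, test_pass):
    """Return the scripts whose category runs in the given phase/pass."""
    wanted = PASS_CATEGORIES[(phase, test_pass)]
    return [s for s in scripts if s["category"] in wanted]

scripts = [
    {"id": "TC-001", "category": "N"},
    {"id": "TC-002", "category": "E"},
    {"id": "TC-003", "category": "DN"},
    {"id": "TC-004", "category": "DE"},
]

print([s["id"] for s in select_scripts(scripts, "system", "A")])
# ['TC-001', 'TC-003']
```

The same rule extends directly to Regression Testing, which draws from all four categories.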
The diagram below gives a high-level overview of the proposed system.
[System architecture diagram: a Browser fronts the Web server (iPlanet 7.1); TeamSite with Open Deploy handles
content deployment; Personalization and MUX applications; Application and Personalization databases (Oracle 9i)
loaded by a nightly XXX feed; components distributed across Hosts 1 to 4.]
Until 11/12: Unit Testing (Local Box; Budget Highway Test)
11/15 to 3/30: Bug fixes (Local Box; Budget Highway ATR)
12/1 to 1/14: QA Testing (QA Environment; Budget Highway ATR)
12/13 to 1/14: Testing by Business, M/F transactions for Rates & Reservation (QA Environment; Budget Highway ATR)
1/17 to 2/11: Limited UAT (QA Environment; Budget Highway ATR)
3/7 to 3/18: Soft Launch, limited testing (Production Environment; Budget Production)
4.5 Responsibilities
1. Functionality based on use cases -- booking engine, non-booking engine, BCD admin tool, non-functional
requirements: Ensure that test cases cover all requirements listed in the signed-off documents - Booking
Engine, Non-Booking Engine, and NFRs.
3. Splash pages hosted under XXX.com for partners: Review the list of splash pages and include them in the
overall testing project plan.
4. Requests from other sites with specific URL parameters to the XXX.com website: (1) Check with the business
whether there is a master list of the external websites from which the XXX.com website is invoked. (2) Discuss
with the technical team and the business the list of URL parameters that will be supported in the new XXX.com
website.
5. Testing of the static content pages in the site: Business needs to complete this list; this activity will
start in the month of November. Review the list of static content pages prepared by the business and include
it in the overall testing project plan.
6. Fast Break front-end application and the admin tool: Amit to have a preliminary discussion with Hans to
understand the functionality. Request Hans to create basic test scenarios. Amit to include a "load test" of
Fast Break in test planning.
7. Indigio-managed admin tools -- affiliate management tool, location correction admin tool: Ask the Indigio
team to come up with a test plan and test cases, and include them in the overall testing project plan. Plan to
get test results and updates during the testing phase.
8. Testing of the new/modified mainframe transactions: Alfredo confirmed that the mainframe team will perform
the unit testing and QA for all mainframe transaction changes (PSR items).
9. Regression testing of all the mainframe transactions: Alfredo confirmed that the mainframe group will
perform the regression testing of all mainframe transactions that are used in Budget.com.
10. Sending emails and email campaign management - E-Dialog: Amit to pass on relevant XXX.com test cases to
E-Dialog, e.g. Reservation Confirmation and Reservation Reconfirmation emails. Get their validation on the
test cases.
11. Reporting -- basic testing: Review the tagging requirements from Indigio and include them in the master
testing project plan.
12. Reporting -- extensive testing including analytics reports: Ask the Indigio team to come up with a test
plan and test cases, and include them in the overall testing project plan. Plan to get test results and
updates during the testing phase.
13. Outage component: Need to discuss further with the technical team and also with IBM regarding the testing.
Plan to get test results and updates during the testing phase.
14. True North (mapping tool) with misspelling corrections: Ask the Indigio team to come up with a test plan
and test cases, and include them in the overall testing project plan. Plan to get test results and updates
during the testing phase.
15. Production testing: Get a detailed plan from the business outlining test scenarios, test data, the group
responsible for testing, and the schedule. Include it in the overall testing project plan.
Note: The Lead System Tester (XYZZ) will be accountable for completion of the testing activities listed above.
However, responsibility for executing those tests rests with the individuals identified against each item.
Risk: System failure and loss of data during the testing process. Probability: Medium; Impact: Low.
Mitigation: A database backup strategy should be in place so that loss of data can be prevented.
Risk: Test data not migrated in time. Probability: Medium; Impact: Low. Mitigation: Test the functionality
not involving the data feed until migration.
Risk: Connectivity during test execution from offshore. Probability: Low; Impact: Medium. Mitigation: A local
test setup should be in place; except for scenarios involving mainframes, tests can be executed locally.
A sanity test will be carried out on every build received from the development team to ensure the suitability
of the application for further testing. A set of functional test cases will be identified to run during the
sanity test. Testing will be suspended when it is not possible to proceed with test execution due to a major
showstopper error in the application.
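The sanity gate described above can be sketched as a small build-acceptance check. This is an illustrative sketch; the check names and the run_sanity/decide helpers are hypothetical, not part of any tool used on this project:

```python
# Illustrative sketch of the sanity gate: run the identified functional
# checks against a new build and suspend further testing if any
# showstopper is found. All names here are hypothetical.

def run_sanity(checks):
    """Run each (name, func) check; return (passed, failures)."""
    failures = []
    for name, check in checks:
        try:
            check()
        except AssertionError as exc:
            failures.append((name, str(exc)))
    return (not failures, failures)

def decide(build, checks):
    """Accept the build for further testing or suspend on showstoppers."""
    ok, failures = run_sanity(checks)
    if ok:
        return f"{build}: accepted for further testing"
    return f"{build}: SUSPENDED, showstoppers: {[n for n, _ in failures]}"

def homepage_loads():
    pass  # stands in for a real check that passes

def rate_request():
    raise AssertionError("no response")  # stands in for a failing check

checks = [("homepage_loads", homepage_loads), ("rate_request", rate_request)]
print(decide("build-42", checks))
# build-42: SUSPENDED, showstoppers: ['rate_request']
```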
Test cases will be developed by the Test team and reviewed by Business before test execution. In case of a
requirements change, refer to the Change Request Process defined in the Approval section.
6 Testing Team
6.1 Core Team
- System Testers:
Please refer to Sharepoint for Technical support team details of eCommerce XXX.com Redesign project:
Any reference to the Testing Team will be those individuals listed above under “Core Team.”
7 Testing Tools
7.1 Testing Tools
The testing tool decision is pending budget approval. The tools listed above are the proposed
testing tools.
QTP will be used for regression testing and Load Runner for load testing.
The test scripts will be created and executed by XYZ Testing Team offshore. The test scripts will be shared
with XXX for onsite execution in the later part of System Testing and again just prior to implementation.
• Data feeds
9 Metrics collection
Detailed defect analysis shall be done for the reported defects and test case execution status shall be
reported for each module.
The metrics to be collected during test life cycle are:
1. Defect location Metrics – Defects raised against the module shall be plotted on a graph to indicate
the affected module.
2. Severity Metrics – Each defect has an associated severity (Critical, High, Medium, or Low) indicating how
much adverse impact the defect has or how important the affected functionality is. The number of issues at
each severity shall be plotted on a graph. By examining the severity of a project’s issues, discrepancies can
be identified.
3. Defect Closure Metrics – To indicate progress, the number of raised and closed defects against time
shall be plotted on a graph.
4. Defect Status Metrics – It will indicate the number of defects in various states like, new, assigned,
resolved, verified, etc.
5. Re-opened bugs – The number of defects re-opened by testing team once they are fixed by
development team shall be reported & percentage shall be calculated with respect to total number of
defects logged.
6. Test case progression trend: This trend shall indicate the progress of test execution module by module. It
shall state the number of test cases planned, executed, passed, and failed.
These metrics shall be collected and presented as test summary report after each test cycle. Also, these
shall be part of weekly status report.
Refer to the Templates section in SharePoint for the Metrics Analysis template.
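The defect location, severity, and re-opened metrics above can be computed from a simple defect log. A minimal sketch, assuming hypothetical field names rather than any specific tracking tool's schema:

```python
# Illustrative sketch of the defect metrics described in this section:
# defect location (per module), severity counts, and re-opened
# percentage. The defect records and field names are hypothetical.
from collections import Counter

defects = [
    {"module": "Reservations", "severity": "Critical", "reopened": False},
    {"module": "Reservations", "severity": "Medium",   "reopened": True},
    {"module": "Rates",        "severity": "High",     "reopened": False},
    {"module": "Homepage",     "severity": "Low",      "reopened": False},
]

by_module = Counter(d["module"] for d in defects)      # defect location metric
by_severity = Counter(d["severity"] for d in defects)  # severity metric
reopen_pct = 100 * sum(d["reopened"] for d in defects) / len(defects)

print(by_module.most_common(1))        # most-affected module
print(f"re-opened: {reopen_pct:.1f}%")
# [('Reservations', 2)]
# re-opened: 25.0%
```

The per-module and per-severity counters are what would be plotted on the graphs the section calls for.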
10 Classification of Issues
The following standard will be used by all involved parties to identify issues found during testing:
Severity 1: Critical Issues: The application crashes, returns erroneous results, or hangs in a major area of
functionality, and there is no workaround. Examples include the inability to navigate to/from a function,
application timeout, and incorrect application of business rules.
Severity 2: High Functional Issues: Functionality is significantly impaired. Either a task cannot be
accomplished or a major workaround is necessary. Examples include erroneous error handling, partial results
returned, and form pre-population errors.
Severity 3: Medium Functional Issues: Functionality is somewhat impaired. A minor workaround is necessary to
complete the task. Examples include inconsistent keyboard actions (e.g. tabbing), dropdown list sort errors,
navigational inconsistencies, and serious format errors causing usage issues (e.g. incorrect grouping of
buttons).
Severity 4: Low Functional Issues: Functionality can be accomplished, but either an annoyance is present or
efficiency can be improved. Cosmetic or appearance modifications to improve usability fall into this category.
Examples include spelling errors, format errors, and confusing error messages.
11 Unit Testing
11.1 Purpose
The purpose of Unit Testing is to deliver code that has been tested for end-to-end functionality within a
given module and normal interfacing between dependent modules in the development environment.
11.2 Responsibility
Testing will be the responsibility of the individual developers. Ultimate signoff for promotion into
Integration Testing will be the responsibility of the Development Project Manager(s). Configuration
management, builds, etc. will be the responsibility of the Configuration Management Team at the
direction of the Development Project Manager.
11.3 Environment
All code will:
• Be tested for complete threads for all code, from UI, to data access, and back to UI using test
data created by developers
• Be tested for one example each of normal, high, and low boundary conditions for Data input
where appropriate
• Successfully execute pairwise test as required for inter-module interfaces, including likely error
conditions (e.g. common data entry error)
Creation of test data and scripts for the purpose of Unit Testing is the responsibility of the development
staff(s).
12 Integration Testing
12.1 Purpose
The purpose of Integration Testing is to deliver code that has been comprehensively tested for Normal (N)
and Exception (E) conditions across all modules in the Development environment.
12.2 Responsibility
Testing Team holds the primary responsibility for the execution of Normal (N) and Exception (E) test
scripts. All N & E type test scripts will be completed prior to the start of Integration Testing. The N & E
test scripts will be executed for the following modules:
• Homepage
• Reservations
• Rates
• Analytics
• Profile Management
• Locations
• Personalization
• Search
• Admin tools
• Visitor Management
• Content Management
• Administration
Configuration management, builds, etc. will be the responsibility of the Configuration Management Team at the
direction of, and with the agreement of, the Development Project Managers and the Testing Project Manager.
Ultimate sign-off of Integration Testing and promotion into System Testing resides with the Testing Project
Manager.
12.3 Environment
A Testing Data Repository Document will be delivered on or before Integration Testing. Specific reference
will be made in the N & E Test Scripts to the data types listed in the Testing Data Repository Document.
Issue Identification: Integration Testers will log issues as they are identified.
Issue Resolution: It is expected that the Development Team(s) will undertake issue resolution based on
severity and priority. All efforts will be made to turn around Critical/High category issues in the next
scheduled Build/Release.
Build and Release Process: The Configuration Management Team will deliver fresh builds as requested
by the Development Project Managers and Testing Project Manager along with release notes.
Issue Closure: After each build, the Testing Team will review the issues that have been resolved in
order to verify and close/re-instate the issues and resolution priority. All effort will be made to close
the resolved issues as soon as possible.
Issue Tracking: The Testing Project Manager will be responsible for the administration of the issue
tracking tool.
• For transaction-based data access, be tested successfully for “normal” and “exception”
conditions
Test scripts will be provided to the Development Project Managers prior to Integration Testing. The test
scripts provided should be used as a baseline for exit criteria expectations. Any additional test or scripts
that the development staff deems necessary will be left at the discretion of the Development Project
Managers. Should the Development Project Managers feel that such scripts should be incorporated into
the Testing Team scripts, they may request such to the Testing Project Manager. It will be the
responsibility of the Testing Project Manager to analyze the feasibility of such incorporation.
Assumptions: Functionality testing of XXX.com by the Testing Team will also include entry points from
other websites via link, travel portals, etc.
Exclusions from Integration Testing: delivery of code that has been comprehensively tested for Data Normal
(DN) and Data Exception (DE) conditions across all modules in the Development environment. Any issues
discovered with the informative pages, creative design, or content should be reported to the respective
development area.
13 System Testing
13.1 Purpose
The purpose of System Testing is to deliver code that has been comprehensively tested and functionality
that is certified to be end-to-end user ready in the System Test environment.
13.2 Responsibility
The Testing Team holds the primary responsibility for the executions of Normal (N), Exception (E), Data
Normal (DN), and Data Exception (DE) test scripts. Test scripts will include field form validation and
display rules as stated in the Elements section of the Page Specifications for eCommerce XXX.com
Redesign. The N, E, DN, & DE test scripts will be executed for the following modules:
1. Homepage
2. Reservations
3. Rates
4. Analytics
5. Profile Management
6. Locations
7. Personalization
8. Search
9. Admin tools
10. Visitor Management
11. Content Management
12. Administration
The System Testing Team will be comprised of individuals from the Testing Staff. Configuration
management, builds, etc. will be the responsibility of the Configuration Management Team at the
direction of the Development Project Managers and requires the agreement of the Testing Project
Manager. Ultimate sign-off of System Testing and promotion into User Acceptance Testing resides with the
Testing Project Manager.
13.3 Environment
System Test Script execution will be completed as per the following Operating System/Browser matrix:
IE 6.0 C C
IE 5.5 U
IE 5.0 U U
Mozilla 1.7.2 C U
Netscape 7.1 U
AOL 5.0 U
A Testing Data Repository Document will be delivered on or before System Testing. Specific reference will
be made in the N, E, DN, & DE Test Scripts to the data types listed in the Testing Data Repository
Document. Additional specific data may be required. Should this be the case, the data will be listed on
the corresponding test script.
Issue Identification: System Testers will log issues as they are identified.
Issue Resolution: It is expected that the Development Team(s) will undertake issue resolution based on
severity and priority. All efforts will be made to turn around Critical/High category issues in the next
scheduled Build/Release.
Build and Release Process: The Configuration Management Team will deliver fresh builds to the System
Test environment, as directed by the Testing Project Manager along with release notes. If the situation
warrants, an emergency build may be released. The Testing Project Manager and all Development
Project Managers must be in agreement to proceed with the emergency build.
Issue Closure: The Testing Team will review the issues that have been resolved to verify and close/re-instate
them and their resolution priority. All effort will be made to close the resolved issues as soon as possible.
Issue Tracking: The Testing Team will be responsible for the administration of the tracking tool.
• Performance/Load tested
• No Severity 1 or 2 issues
• Security Testing
o DB crash
o iPlanet crash
o Hardware
o DB capacity/resources
Regression and Load testing will take place prior to promotion to the User Acceptance Testing. Please see
the Regression Testing and Load Testing sections of this document for further information.
A copy of the test plan will be provided to the User Acceptance Group prior to System Testing for their
review. The test plan provided should be viewed as a baseline for System Testing exit criteria
expectations. Any items in the test plan that the System Testing Team or User Acceptance Group feels
should be modified or added should be submitted to the Testing Project Manager. It will be the
responsibility of the Testing Project Manager to analyze the feasibility of such incorporation or
modification.
Assumptions:
Functionality testing of XXX.com by the Testing Team will also include entry points from other websites
via link, travel portals, etc.
Visitor tracking details will be verified only at jsp level by “View Source” as Reporting tool has not been
finalized.
14 Mainframe testing
14.1 Purpose
The purpose of Mainframe testing is to deliver stable code for the new functionality (Rate Shop) and to verify
that existing functionality works correctly with the new XXX.com application.
14.2 Responsibility
Mainframe testing will be carried out by Mainframe QA team at XXX. It will be scheduled and coordinated
by XYZ Test team according to test execution dates for System testing, UAT & PAT.
14.3 Environment
Mainframe modules will reside in the Budget Highway Acceptance Test Region (ATR).
The Mainframe QA team at XXX will deliver the unit-tested code of the Rate Shop feature to the XYZ Development
team. After integration with the application, the XYZ Test team will verify the Rate Shop feature from an
end-to-end user perspective, i.e. from the front end to the mainframe database. The XYZ Test team will be
trained in using Mainframe screens to verify rates and reservation data. The XYZ Test team will raise issues
using PVCS Tracker and escalate them to the IT Project Manager (Alfredo Palacios), who will take them forward
with the Mainframe team for fixes.
To follow up on Mainframe testing progress (during the mainframe testing period), a status report will be
provided to the Budget team on a weekly basis by the Mainframe QA team.
During test execution, the XYZ Test team will provide a list of reservations to the Mainframe QA team so that
it can verify converting reservations to rentals and generating rental agreement numbers successfully.
15 Load Testing
15.1 Purpose
The purpose of Load Testing is to deliver code that has been load tested and is ready for promotion into
the Production Environment.
15.2 Scope
Load Testing will consist of a select group of (N) scripts that accurately represent a cross section of
functionality. Scripts will be executed to generate up to 1500 user peak load levels.
Tests will be executed at load levels of 50, 100, 200, 500, 1000, and 1500 concurrent users. Test execution
will be complete when the 1500-user load has been ramped up or any failure condition necessitates stopping the
test. The team will monitor the test execution and record the timings and errors for report preparation.
15.3 Responsibility
The creation and execution of the Load Testing Scripts is the responsibility of the Testing Team. Ultimate
authority rests with the Testing Project Manager, who will be in close contact with User Acceptance
Group.
15.4 Environment
The Mercury LoadRunner tool will be physically located on a server in Denver. For test execution, LoadRunner
will be pointed at the System Testing Environment, which will become the Production Environment upon
implementation. The XYZ Test team will access LoadRunner using a remote client tool to execute the scripts.
The offshore Test team will be allocated one VuGen license to create scripts offline.
Description                    IP Address
Budget Application Servers
Controller
Load Generator
DB Server
Web Server
15.5.1 Load testing
Load testing will be carried out under varying workloads to assess and evaluate the ability of the system
under test to continue to function properly under these different workloads. The goal of load testing is to
determine and ensure that the system functions properly beyond the expected maximum workload. Additionally,
load testing evaluates performance characteristics (response times, transaction rates, and other
time-sensitive measures).
15.5.1.1 Serviceability
Approach
- Determine the serviceability of the system for a volume of 1500 concurrent users.
- Measure response times for users
Steps
1. Virtual users estimation: Arrive at the maximum number of concurrent users hitting the system at which the
system response time is within the response-time threshold and the system is stable. This number is the
virtual user count and should be higher than the average load by a factor of x.
Schedule for concurrent-user testing with a mix of user scenarios and the acceptable response times (matrix
cells not recoverable from the source):
User types: Corporate Program, XXX Employee, Travel Agent, Unaffiliated consumer, XXX Partner, UB Program
member, PD member, FB member, member.
Transactions: Homepage load, Login/Logout, Rate Request response, Rate Request response for multi-BCD rate
shop, Create a booking, Modify/Cancel booking.
Statistics
• The graph with y-axis representing response times and x-axis concurrent users will depict the
capability of the system to service concurrent users.
• The response times for slow users will provide worst-case response times
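The worst-case response-time statistics mentioned above can be summarized with standard-library percentile functions. A minimal sketch with made-up sample timings:

```python
# Illustrative sketch: summarizing per-user response times into mean,
# ~90th-percentile, and worst-case figures. The sample timings below
# are invented for illustration; only the standard library is used.
import statistics

# response times (seconds) sampled for one transaction at a given load
timings = [0.8, 0.9, 1.1, 1.2, 1.3, 1.5, 1.8, 2.0, 2.4, 4.9]

mean = statistics.mean(timings)
p90 = statistics.quantiles(timings, n=10)[-1]   # ~90th percentile cut point
worst = max(timings)                            # worst-case (slowest user)

print(f"mean={mean:.2f}s  p90={p90:.2f}s  worst={worst:.2f}s")
```

Plotting such summaries against the concurrent-user count yields the response-time-versus-load graph the Statistics bullet describes.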
15.5.2 Endurance testing
Validate system behavior for continuous hours of operation under projected load conditions.
Approach
- Endurance testing: check resource usage and release, namely CPU, memory, disk I/O, and network (TCP/IP
sockets) congestion, for continuous hours of operation.
- Determine robustness: check for breakages in the web server, application server, and database server under
CHO (Continuous Hours of Operation) conditions.
Steps
1. Arrive at a baseline configuration of the web server and application server resources, i.e. CPU, RAM, and
hard disk, for the endurance and reliability test.
2. The test will be stopped when one of the components breaks. A root cause analysis is to be carried out
based on the data collection described under the server-side monitoring section.
- Collect data for analysis to tune the performance of the web server, application server, and database
server.
- If there is alarm support in the tool through an agent, check for alerts when the activity level exceeds
preset limits.
- If there is a load balancing configuration deployed, check that it is able to distribute the requests.
Result
The result of this test will be proof of confidence in Continuous Hours of Operation. The data collected in
this phase will give pointers to improve the reliability of the system and to fix configuration and component
parameters for reliable performance.
The test cycle shall be run for 50 users initially (say, incrementing 5 users every 5 seconds until it
reaches 50 concurrent users). The test shall be stopped if the application crashes before reaching 50
users, and the issue shall be reported to the development team. The response time shall be noted for 50
concurrent users before stopping the test. If the response time exceeds the benchmark limit, load
testing shall be stopped until the development team fixes the issue. If the response time is well within
the benchmark limit, a fresh test cycle shall be run with the aim of reaching 100 concurrent users. The
same process shall be repeated until the target of 1500 concurrent users is met within acceptable
response times.
The response times will be noted for the following user loads within the same test cycle:
50 users
100 users
200 users
500 users
1000 users
1500 users
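The ramp-up discipline above can be sketched as follows: 5 virtual users are added every 5 seconds toward each target, the cycle aborts if a reading exceeds the benchmark, and otherwise the next target is attempted. `measure_response` and the benchmark value are stand-ins for the real load tool's probe and XXX's agreed limit.

```python
# Sketch of the ramp-up cycles described above. `measure_response`
# stands in for the load tool's probe; the benchmark is an assumed
# figure, not XXX's agreed response-time limit.

TARGETS = [50, 100, 200, 500, 1000, 1500]
BENCHMARK_S = 8.0   # assumed response-time limit
STEP_USERS = 5      # 5 users added per 5-second ramp step

def run_cycles(measure_response, targets=TARGETS):
    """Returns (highest target reached, readings per target); stops
    early when a reading exceeds the benchmark, as the plan requires."""
    readings = {}
    reached = 0
    for target in targets:
        users = 0
        rt = 0.0
        while users < target:
            users += STEP_USERS        # one ramp step
            rt = measure_response(users)
            if rt > BENCHMARK_S:       # report and halt for a dev fix
                return reached, readings
        readings[target] = rt          # response time noted at target
        reached = target
    return reached, readings

# Hypothetical probe: response time grows slowly with load
reached, readings = run_cycles(lambda u: 1.0 + u / 400.0)
print(reached)  # → 1500 under this well-behaved model
```

A crash before a target is reached would surface as an exception or timeout from the probe, which in the plan corresponds to stopping the test and reporting the issue to development.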
The first cycle of load testing will be carried out on the QA environment and the second cycle on the
Production environment during the System Testing phase.
The following metrics shall be captured:
• Response Time
• Throughput
• Concurrent users
• Processor Usage
• Memory Usage
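The client-side metrics in the list above (response time, throughput) can be derived from a test cycle's transaction log as sketched below; processor and memory usage come from the server-side monitoring agents instead. The figures shown are hypothetical.

```python
# Sketch: deriving client-side metrics (response time, throughput)
# from one test cycle's transaction log. Server-side metrics
# (processor, memory usage) come from the monitoring agents instead.

def metrics(durations, wall_clock_s):
    """durations: per-transaction response times in seconds;
    wall_clock_s: length of the measurement window in seconds."""
    n = len(durations)
    return {
        "avg_response_s": round(sum(durations) / n, 3),
        "max_response_s": max(durations),
        "throughput_tps": round(n / wall_clock_s, 3),
    }

# Hypothetical 10-second window with four completed transactions
print(metrics([1.2, 0.8, 1.5, 1.0], wall_clock_s=10))
# → {'avg_response_s': 1.125, 'max_response_s': 1.5, 'throughput_tps': 0.4}
```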
Assumptions
1. The transaction mix (user mix) shall be provided by the XXX Business team.
2. The XXX Team shall provide the application setup, with the application verified before load
testing begins.
Constraints
If load test scripts are executed from offshore, network delay will add to the response times.
In order for Load Testing to be considered successful, the Load Scripts must be successfully executed
under the following conditions:
• Meet the exit criteria for the phase in which the Load Test is executed
16 Regression Testing
16.1 Purpose
Deliver code that has been regression tested and is ready for promotion into the Production Environment.
Regression Testing will consist of a majority of (D), (N), (DN), and (DE) type test scripts.
16.2 Responsibility
The creation of the Regression Testing Scripts is the responsibility of the Testing Team. Regression Test
Scripts will be created and executed using Mercury QuickTest Pro. The execution of the Regression
Testing Scripts is also the responsibility of the Testing Team.
16.3 Environment
The Mercury QuickTest Pro software will be physically located in XYZ, Bangalore. For the purpose of
test execution, QTP will be pointed at the System Testing Environment.
In order for Regression Testing to be considered successful the results must meet the exit criteria stated
in the corresponding testing phase exit criteria. For example, Regression Scripts executed during the
System Testing phase must meet the exit criteria stated in the System Testing section of this document.
17 User Acceptance Testing
17.1 Purpose
The purpose of User Acceptance Testing is to deliver code that has been tested by the User Acceptance
Test Group, with functionality certified as end-to-end user ready for promotion into the Production
Environment.
17.2 Responsibility
User Acceptance Testing is to be executed by the User Acceptance Group (Business). Management of the
User Acceptance Testing phase will be the responsibility of the Testing Project Manager via the User
Acceptance Group Coordinator. The test scripts used during User Acceptance Testing are to be created by
the Test Team with the help of the Business Analyst and the User Acceptance Group. Test scripts should
accurately reflect the functionality documented in the Booking Engine Use Cases EBR - Budget.com, the
Non-Booking Engine Use Cases XXX.com Redesign EBR and the Page Specifications for Ecommerce
XXX.com Redesign.
Ultimate authority rests with the Testing Project Manager, who will be in close contact with the User
Acceptance Group Coordinator. Configuration management, builds, etc. will be the responsibility of the
Configuration Management Team at the direction of the Development Project Manager and requires the
agreement of the Testing Project Manager.
17.3 Environment
The requesting of data migration/creation is the responsibility of the Testing Project Manager. Details
surrounding the migration/creation will be forwarded to the appropriate individuals. A Testing Data
Repository Document will be delivered on or before User Acceptance Testing. Specific reference will be
made in the N & E Test Scripts to the data types listed in the Testing Data Repository Document.
Additional specific data may be required. Should this be the case, the data will be listed on the
corresponding test script.
Issue Identification: UAT Testers will log issues as they are identified.
Issue Resolution: The Development Team(s) are expected to undertake issue resolution based on
severity and priority. All efforts will be made to turn around Critical/High category issues in the next
scheduled Build/Release.
Build and Release Process: The Configuration Management Team will deliver fresh builds to the UAT Test
environment, as directed by the Testing Project Manager along with release notes. If the situation
warrants, an emergency build may be released. The Testing Project Manager and all Development Project
Managers must be in agreement to proceed with the emergency build.
Issue Closure: The UAT Testing Team (Business) will review resolved issues to verify and close or
re-open them, confirming the resolution priority. All effort will be made to close resolved issues
as soon as possible.
Issue Tracking: The Testing Team will be responsible for the administration of the tracking tool.
The UAT defect workflow is as follows:
1. A UAT group tester identifies a defect, logs it in PVCS Tracker and assigns it to the UAT
coordinator.
2. The UAT coordinator reviews the defect for validity and details; if valid, the coordinator assigns
the defect to a developer, otherwise the defect is closed.
3. The developer fixes the defect.
4. A new application version is released into the UAT environment with Release Notes.
5. The Test team verifies the fixed defect.
6. The UAT group tester verifies the defect; if it passes, the defect is closed, otherwise it is
re-assigned to the developer.
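The defect lifecycle implied by the workflow above can be sketched as a small state machine. The state and event names are illustrative; PVCS Tracker's actual states may differ.

```python
# Sketch of the UAT defect lifecycle. State/event names are
# illustrative, not PVCS Tracker's actual states.

TRANSITIONS = {
    "logged":   {"valid": "assigned", "invalid": "closed"},
    "assigned": {"fixed": "released"},     # developer fixes defect
    "released": {"verified": "retest"},    # test team verifies the fix
    "retest":   {"pass": "closed", "fail": "assigned"},
}

def advance(state, event):
    """Moves a defect one step; unknown events leave it unchanged."""
    return TRANSITIONS.get(state, {}).get(event, state)

# A defect flowing through the happy path of the workflow
state = "logged"
for event in ("valid", "fixed", "verified", "pass"):
    state = advance(state, event)
print(state)  # → closed
```

A failed retest routes the defect back to "assigned", matching the workflow's No branch.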
17.6 Exit Criteria
In order to be accepted for promotion to the Production Environment, the application must meet the
following:
• No open Severity 1, 2 or 3 issues
• The User Acceptance Group must approve any Severity 4 issues to be included in the production
release
• Security Testing
o DB crash
o iPlanet crash
o Hardware
o DB capacity/resources
Regression and Load testing will take place prior to promotion to the Production Environment. Please see
the Regression Testing and Load Testing sections of this document for further information.
• Any changes or new functionalities that arise during UAT will go through the Change Management
process
18 Soft Launch
TBD (details will be added after discussion with the Business)