
Citibank, N.A.

User Acceptance Testing Life Cycle
Version 2.0
Global Cash Management Services

Contents

1 OVERVIEW

1.01 Introduction
Introduction to UAT Procedures
Explanation of Document Organization
Overview of Functions
Overview of Forms
Overview of Project Development Life Cycle (PDLC)
Overview of Glossary, Revisions and Concurrence
1.02 Road Maps
Introduction to Road Maps
UAT Project Leader Road Map
UAT Team Member Road Map

2 UAT FUNCTIONS

2.01 Documentation Review
Required Documentation
Key Documentation Preparation and Review
Documentation Review Criteria
2.02 Requirements/Design Change Management
Introduction to Requirements and Design Change Management
Procedure Scope: Types of Work Products
Submitting Change Requests
Evaluating Change Requests
Making and Documenting Changes
Reviewing and Approving Changes
Distributing Revised Work Products and Notifications
2.03 Test Planning and Management
Introduction to Test Planning and Management
Required Background for UAT Planning and Management
Supporting Procedures and Materials
UAT Initiation
Initial Work Plan Preparation
Work Plan Completion, Update and Revision
Developing the UAT Work Breakdown Structure
Test Plan Preparation: Overview
UAT Plan Section 1. Introduction
UAT Plan Section 2. Strategy
UAT Plan Section 3. Work Plan
UAT Plan Sections 4-6. Procedures, Case Design, Test Execution
Test Case, Script and Data Development
Prior Test Results Review
Test Execution Management
Status Reporting
Sample Work Breakdown Structure
2.04 UAT Estimating
Introduction to UAT Estimating

UAT Estimating Inputs
Developing Phase-by-Phase Estimates: Concepts
Estimating Phase 1: UAT Initiation
Estimating Phase 2: UAT Planning
Estimating Phase 3: UAT Case, Script and Data Development
Estimating Phase 4: UAT Execution
Typical Allocation of Testing Effort
2.05 Requirements Validation and Defect Reporting
Introduction to Requirements Validation and Defect Reporting
Requirements Decomposition
Types of Requirements Defects
Requirements Defect Reporting and Follow-Up Responsibilities
Requirements Defect Reporting Procedure
2.06 Software Migration to UAT and Production
Introduction to Software Migration
Software Migration Procedure
Examples of Software Promotion and Demotion
2.07 User Acceptance Test Execution
Introduction to UAT Execution
Reporting Problems and Change Requests
Reporting User Acceptance Test Results
2.08 Test Results Archive
Test Results Archive Requirements
2.09 Certification Process
Certification Criteria
Certification Procedure
Certification/Non-Certification Memo
2.10 Post Implementation Review
Post-Implementation Review Process

3 FORMS

3.01 Project Initiation and Control
PDLC Checklist
3.02 Documentation Control
Documentation Check List
Documentation Control Log
3.03 Problem Reporting
Requirements Trouble Report
System Trouble Report
3.04 System Approval and Release
Certification/Non-Certification Memo

4 PROJECT DEVELOPMENT LIFE CYCLE (PDLC)

4.01 PDLC Flow Diagram
PDLC FLOW DIAGRAM Part 1: Procedures 1-5
PDLC FLOW DIAGRAM Part 2: Procedures 6-12B
PDLC FLOW DIAGRAM Part 3: Procedures 12C-17
PDLC FLOW DIAGRAM Part 4: Procedures 18-24A
PDLC FLOW DIAGRAM Part 5: Procedures 25-32
4.02 PDLC Description
PDLC DESCRIPTION Part 1: Procedures 1-5
PDLC DESCRIPTION Part 2: Procedures 6-12B
PDLC DESCRIPTION Part 3: Procedures 12C-17
PDLC DESCRIPTION Part 4: Procedures 18-24A
PDLC DESCRIPTION Part 5: Procedures 25-32

5 ATTACHMENTS

5.01 User Acceptance Test Plan Template and Sample
User Acceptance Test Plan Template
User Acceptance Test Plan Sample

6 GLOSSARY, REVISIONS AND CONCURRENCE

6.01 Glossary
Glossary of Testing Terms
6.02 Revisions
Revision History/Plan
6.03 Concurrence
Concurrence of UAT Management

1 Overview

1.01 Introduction

Introduction to UAT Procedures

This set of procedures focuses on the primary responsibilities of the User Acceptance Tester and is
intended to be used as a guide. It identifies the top level functions, strategy, requirements and
dependencies of the User Acceptance Testing phase for certification of a release. This process will verify
that the release is reliable and stable. The standards that will be followed to ensure a quality product at
the end of the testing phase are addressed. These procedures address projects only. They are not
intended to cover a process for emergency fixes.

The Introduction chapter provides a summary of all of the material in the order in which it is organized.
The Road Maps, discussed below, provide a summary of procedures and steps in the order in which they
occur in the course of a project.

UAT Project Leader Road Map

The UAT Project Leader Road Map is provided to facilitate the use of the procedures by the individual in
charge of the UAT team for a given project. It provides a step-by-step guide to the entire UAT process,
with cross-references to the procedures covering the tasks for which the project leader is responsible,
and to the PDLC.

UAT Team Member Road Map

The UAT Team Member Road Map is provided to facilitate the use of the procedures by each member of
the UAT team for a given project. It provides a step-by-step guide to the entire UAT process, with cross-
references to the procedures covering the tasks in which the team member participates, and to the
PDLC.

Explanation of Document Organization

The UAT material is organized in the Table of Contents view according to the following examples:

•  “1 Overview” is a chapter, the highest level. The Chapter button on each Notes document
selects the beginning of the current chapter in the Table of Contents view, which shows chapters
and sections as categories.

•  “1.01 Introduction” is a section, the second level. The Section button on each Notes document
selects the beginning of the current section in the Table of Contents view. The Index view shows
section names as categories. Generally, a section is an entire procedure or other document.

•  “Explanation of Document Organization” is a subsection, the third level. Generally, the
documents in the Notes database are at the subsection level. For ease of maintenance,
subsections are not numbered. Subsections are topics within procedures, individual diagrams,
forms, etc.

You can use the "Prev<<" and ">>Next" buttons at the top of any document to go to the previous or next
document in the Table of Contents View. The original document will be closed.

Cross-referenced documents can be opened directly by double-clicking the doclinks placed throughout
the Notes database. For example, this doclink will open the Glossary of Testing Terms. (doclink placed
here.)

When you use a doclink, the original document will stay open, since you will probably continue to read it.
When you close the new document, e.g., by hitting <Esc>, you will return to the original document.

Overview of Functions

A procedure is provided for each of the following major functions. The doclink placed next to each
procedure name will open the first topic in the procedure.

•  Documentation Review

UAT is available to review the documentation and provide input with respect to how a solution
can be provided.

•  Requirements/Design Change Management

This procedure provides for managing change in the essential documents that provide the
requirements on which testing is based. This procedure covers submitting and evaluating
change requests, making and documenting changes, and reviewing and approving the changes.

•  Test Planning and Management

As soon as documents relevant to the project are received, UAT will develop the test plan. When
the test plan is approved, the test cases and scripts will be developed. The test plan will address
all aspects of the project request. The plan will serve as a tool to verify that the release satisfies
the approved functional specifications and business requirements document. A different member
of UAT will review the test plan for thoroughness. The procedure also describes how the
progress of the test is tracked.

•  UAT Estimating

This procedure details the information on which the estimates are based, the approach to
developing the work breakdown structure (WBS) and the suggested methods for developing
estimates of effort. An overview of the typical allocation of testing effort is provided to be used as
a reasonableness check.

•  Requirements Validation and Defect Reporting

An essential stage in test planning is review of the requirements documents. This procedure
provides for structuring the requirements to facilitate the development of test cases and the
verification of requirements coverage. The procedure also provides for reporting of errors,
ambiguities and other issues back to the developers so that problems can be removed from the
system at the earliest possible stage.

•  Software Migration to UAT and Production

This procedure specifies that prior to testing, the turnover package must be complete. See the
Documentation Check List form. All required components of the release must be specified in the
Release Information Bulletin as part of the turnover process. Additionally, the unit and system
test plan and test results should be available upon request. The Configuration Manager will
migrate and secure the required components from the Development Library to the UAT Library
for mainframe applications and CCM Plus, and will also build releases of CCM Plus from source
received from Development.

•  Test Execution

Testing of the program(s) created or changed for a release will be performed within a UAT test
environment as specified by the Test Execution Procedure. Various types of testing will be
performed to verify the integrity of the software.

•  Certification Process

A release memo will be sent to the Lead Project Manager upon the successful completion of
testing, summarizing the test results and the readiness of the system or release for production.

•  Test Results Archive

The checklist and/or project plan must be updated to indicate completion of testing. Once the
entire process is completed, the test results will be stored both online and in hard copy form in a
cabinet. An entry will be made in the control log upon removal and return of any hard copy
documentation from the file.

•  Post Implementation Review

A review of the software release is conducted by the Lead Project Manager or Implementation
Manager, to determine customer or user satisfaction levels, as described in the Post-
Implementation Review Procedure. It should also provide feedback on improvements to the
current release and/or new enhancement recommendations. UAT will be represented during this
review.

Overview of Forms

The following forms are used in the process.

•  Project Request Form

The form used by a project sponsor to initiate projects.

•  PDLC Checklist

This form is a reproduction of the entire PDLC Description, enabling the Date Completed or
Explanation to be entered in the last column.

•  Documentation Check List

This form is used by the UAT project leader to verify and record that all required documentation
is received.

•  Documentation Control Log

This form is used by staff members to record temporary removal of documentation from the
archive.

•  Requirements Trouble Report

This Lotus Notes form is used by UAT planners to report problems and change requests at the
stage of requirements review.

•  System Trouble Report

This Lotus Notes form is used by UAT testers to report problems and change requests at the
stage of UAT execution.

•  Certification/Non-Certification Memo

This form is used by the UAT project leader to report the final results of UAT and the readiness of
the system for Production.

•  Release Information Bulletin (RIB)

This form is transmitted by Technology to Production, via UAT, to document the content of each
release.

Overview of Project Development Life Cycle (PDLC)

The PDLC specifies the essential steps and products in the project life cycle, indicating the involvement
of the UAT group. The life cycle is presented in two forms:

•  PDLC Flow Diagram

This diagram, divided into five pages, is a graphical presentation showing the involvement of
each organizational function in each life cycle product. Each product is numbered for reference.

•  PDLC Description

An explanation is provided for each page of the PDLC, summarizing for each product the
responsibility of the specified organizational groups, and the rationale and desired output. The
explanation is keyed to the diagram by the product numbers.

Overview of Glossary, Revisions and Concurrence

The final chapter contains three sections:

•  Glossary

A glossary of testing terms is provided to facilitate comprehension of the UAT procedures.

•  Revisions

These procedures will be re-examined on a quarterly basis for possible updating, and a
description of the latest revision and future revision plans, if known, added in the Revisions
section.

•  Concurrence

The procedures are placed in effect by the concurrence of UAT management, as listed in the
Concurrence section.

1.02 Road Maps

Introduction to Road Maps

The UAT Project Leader and Team Member Road Maps are tables that provide a step-by-step summary
of the activities to be performed by the UAT organization for a development or maintenance project,
specify the involvement of the UAT project leader and UAT team member, respectively, in each step, and
provide references to the PDLC and to the UAT methodology documents.

The UAT Project Leader Road Map lists all UAT activities, since by definition all require management
and/or participation by the UAT project leader.

In each Road Map, each row describes a procedure step. The columns are as follows:

•  The Step Number and Name column provides a sequential number for convenience of reference,
and a short descriptive name for the step. The name begins with an action verb that indicates
the role of the UAT project leader or team member.

•  The PDLC Number column provides a reference to the PDLC procedures. All PDLC procedures
that show participation by Distribution Services/UAT are included. Testing-related procedures
are generally broken out into multiple steps.

•  The Description column describes the step and the involvement of the UAT project leader or
team member. If not specified, the leader will choose whether to involve other team members,
depending on project size and other factors. Optional participation is indicated where appropriate
for PDLC procedures shown with dotted lines on the PDLC Flow Diagram.

The Description column may also specify when the step is performed, based on the completion of
other activities within or outside the UAT group. If the timing is not specified, the step is to be
performed as soon as possible after the previous step, or in parallel with it where possible.

•  The Reference(s) column provides the number and name of the section or subsection in which
relevant material may be found, particularly for testing-related steps. (See “Explanation of
Document Organization” in Section 1.01.) References to the PDLC Flow Diagram and
Description are provided only at the beginning of the group of steps found on each page to avoid
repetition. These references provide a guide to key parts of the material; however, not all topics
are referenced from the Road Maps.

The Notes database is categorized by chapter and section. A name in bold face type after the number is
that of the section corresponding to the number, indicating that the entire section is relevant to the step.
Double clicking the Notes Doclink placed in the Reference column will open the first document in the
section. Most individual documents in the Notes database represent topics within sections, or
subsections. A name in normal type is that of a subsection within the section corresponding to the
number, indicating that the subsection is specifically relevant to the step. The Doclink will open that
document.

UAT Project Leader Road Map

Each step below gives the step number and name, the PDLC procedure number (PDLC No.) where
applicable, a description of the project leader role, and reference(s) to the relevant section/subsection
(Sec. # and Section/Subsection). Where a step has no PDLC number or reference, that entry is omitted.

PDLC References: 4.01 PDLC Diagram Page 1; 4.02 PDLC Description Page 1

Step 1. Define Concept (PDLC No. 1)
    Role: Optionally participate in gaining agreement on project goals.

Step 2. Create High Level BRD/PRF (PDLC No. 1A)
    Role: Optionally participate in development of High Level BRD/PRF.

Step 3. Perform Team Review (PDLC No. 3)
    Role: Participate in determining priority and timeframes.

Step 4. Conduct Pre-MEP Development/Approval Process (PDLC No. 4)
    Role: Participate in decision on MEP preparation and responsible parties.

Step 5. Develop Detailed BRD (PDLC No. 4A)
    Role: Optionally participate in development of detailed BRD.

Step 6. Conduct Business Requirements Document Review (PDLC No. 5)
    Role: Schedule and conduct formal review of BRD with follow-up review as needed; assure
    adequate change management.
    Reference(s): 2.01 Key Documentation Preparation and Review; 2.02 Requirements/Design
    Change Management
PDLC References: 4.01 PDLC Diagram Page 2; 4.02 PDLC Description Page 2

Step 7. Hold UAT Kickoff Meeting (PDLC No. 6B)
    Role: Schedule and conduct UAT kickoff meeting with business sponsor and Technology project
    leader when BRD is approved.
    Reference(s): 2.03 UAT Initiation

Step 8. Organize UAT Team
    Role: Determine staffing requirements, select UAT and user staff and obtain commitments.

Step 9. Prepare Initial UAT Work Plan
    Role: Prepare UAT team work plan (tasks, estimates, schedule) covering UAT Initiation and Test
    Plan Development phases completely and later phases at high level (see below).
    Reference(s): 2.03 Work Plan Preparation; 2.04 UAT Estimating

Step 10. Provide Project Orientation to Team
    Role: Provide orientation on system and project to UAT and user team members as needed.
    Reference(s): 2.03 UAT Initiation

Step 11. Initiate UAT Plan
    Role: Prepare early sections of the User Acceptance Test Plan, dealing with testing approach.
    Reference(s): 2.03 Test Plan Preparation; 5.01 User Acceptance Test Plan Template; 5.01 User
    Acceptance Test Plan Sample

Step 12. Conduct Functional Specifications Review (PDLC No. 7)
    Role: Schedule and conduct formal review of Functional Specifications with follow-up review as
    needed.
    Reference(s): 2.01 Key Documentation Preparation and Review

Step 13. Assure Specification Change Control (PDLC No. 8D)
    Role: Assure adequate change control for functional specifications.
    Reference(s): 2.02 Requirements/Design Change Management

Step 14. Validate Requirements and Functional Specifications (PDLC No. 8F)
    Role: Lead team members who will write test cases in preparing Requirements Hierarchy,
    reviewing documents in detail, and reporting problems found.
    Reference(s): 2.05 Requirements Validation and Defect Reporting

Step 15. Complete UAT Plan
    Role: Prepare later sections of the User Acceptance Test Plan, specifying builds, test runs, test
    cases and testing procedures. May be done in parallel with previous step.
    Reference(s): 2.03 Test Plan Preparation; 5.01 User Acceptance Test Plan Template; 5.01 User
    Acceptance Test Plan Sample

Step 16. Submit UAT Plan for Review (PDLC No. 8H)
    Role: Submit UAT Plan for review by users and by a UAT member not involved in its
    development; revise as needed.
    Reference(s): 2.03 Test Plan Preparation

Step 17. Develop MEP Financial Plan (PDLC No. 9)
    Role: Participate in development of financial plan/justification.

Step 18. Review MEP Financial Plan (PDLC No. 10)
    Role: Participate in review of financial plan/justification.

Step 19. Extend UAT Work Plan (PDLC No. 11A)
    Role: Extend UAT team work plan (tasks, estimates, schedule) to cover Test Case/Script/Data
    Development and Prior Test Results Review phases in detail.
    Reference(s): 2.03 Work Plan Preparation; 2.04 UAT Estimating

Step 20. Develop Test Cases, Scripts, Data
    Role: Lead team members in developing new/revised test cases and scripts; develop or obtain
    test data.
    Reference(s): 2.03 Test Case, Script and Data Development

Step 21. Complete UAT Work Plan
    Role: Extend UAT team work plan (tasks, estimates, schedule) to cover Test Execution phase(s)
    in detail.
    Reference(s): 2.03 Work Plan Preparation; 2.04 UAT Estimating

Step 22. Submit Test Cases, Scripts, Data for Review (PDLC No. 11B)
    Role: Submit test cases, scripts and data for review by users and by a UAT member not
    involved in their development; revise as needed.
    Reference(s): 2.03 Test Plan Preparation

PDLC References: 4.01 PDLC Diagram Page 3; 4.02 PDLC Description Page 3

Step 23. Develop Operational Procedures (PDLC No. 12C)
    Role: Participate in development of new or changed operational procedures.

Step 24. Review Technical Spec., Integrated Project Plan (PDLC No. 12)
    Role: Participate in review and signoff of “frozen” specifications and project plan.

Step 25. Review Prior Test Results (PDLC No. 16A)
    Role: Review results of unit and integration testing.
    Reference(s): 2.03 Prior Test Results Review

Step 26. Assure Migration to UAT (PDLC No. 17)
    Role: Assure/coordinate migration of software to UAT.
    Reference(s): 2.06 Software Migration to UAT and Production
PDLC References: 4.01 PDLC Diagram Page 4; 4.02 PDLC Description Page 4

Step 27. Obtain Audit Review (PDLC No. 18)
    Role: Obtain audit review of final UAT test plan, if required.

Step 28. Manage UAT Execution (PDLC No. 19)
    Role: Lead team members in executing test cases/scripts, reporting and following up problems
    and change requests; report results and status of testing to users and management.
    Reference(s): 2.03 Test Execution Management; 2.03 Status Reporting

Step 29. Analyze Test Results (PDLC No. 20A)
    Role: Assemble and study all test results; develop recommendations.
    Reference(s): 2.03 Test Execution Management

Step 30. Identify Errors/Issues (PDLC No. 21)
    Role: Address and prioritize errors and issues.

Step 31. Perform Final Regression Tests (PDLC No. 22)
    Role: Retest to confirm resolution of errors and issues.

Step 32. Archive UAT Results
    Role: Assure placement of UAT results in archive.
    Reference(s): 2.08 Test Results Archive

Step 33. Coordinate User Acceptance/Sign-Off (PDLC No. 23)
    Role: Prepare Certification/Non-Certification Memo; assure follow-up by Technology if
    non-certification or rejection by user.
    Reference(s): 2.09 Certification Process

Step 34. Coordinate Audit Review/Sign-Off (PDLC No. 24)
    Role: Coordinate review of project, documentation, test results; demos if needed/appropriate.

PDLC References: 4.01 PDLC Diagram Page 5; 4.02 PDLC Description Page 5

Step 35. Assure Migration to Production (full or partial) (PDLC No. 25)
    Role: Assure/coordinate migration of software to Production (partial production if pilot or parallel
    test required).
    Reference(s): 2.06 Software Migration to UAT and Production

Step 36. Perform Changeman Procedures (PDLC No. 26)
    Role: Optionally participate in sign-off/move schedule/release control.

Step 37. Manage Pilot/Parallel Test Execution where applicable (PDLC Nos. 30, 30A)
    Role: Lead team members (users) in executing pilot and/or parallel tests, reporting and following
    up problems and change requests; report results and status of testing to users and management.
    Reference(s): 2.03 Test Execution Management; 2.03 Status Reporting

Step 38. Assist Rollout to Full Production (if required) and Feedback (PDLC No. 31)
    Role: Coordinate/assure migration of software to full Production if pilot or parallel test has been
    conducted.
    Reference(s): 2.06 Software Migration to UAT and Production

Step 39. Assure User Training
    Role: Assure/coordinate user training.

Step 40. Conduct Post-Implementation Review (PDLC No. 32)
    Role: After three to six months of production, represent UAT at a formal review of benefits and
    “lessons learned,” in conjunction with business sponsor and Technology.
    Reference(s): 2.10 Post Implementation Review
UAT Team Member Road Map

Each step below gives the step number and name, the PDLC procedure number (PDLC No.) where
applicable, a description of the team member role, and reference(s) to the relevant section/subsection
(Sec. # and Section/Subsection). Where a step has no PDLC number or reference, that entry is omitted.

PDLC References: 4.01 PDLC Diagram Page 1; 4.02 PDLC Description Page 1

Step 1. Develop Detailed BRD (PDLC No. 4A)
    Role: Participate if assigned in development of detailed BRD.

Step 2. Conduct Business Requirements Document Review (PDLC No. 5)
    Role: Participate if assigned in formal review of BRD with follow-up review as needed.
    Reference(s): 2.01 Key Documentation Preparation and Review; 2.02 Requirements/Design
    Change Management

PDLC References: 4.01 PDLC Diagram Page 2; 4.02 PDLC Description Page 2

Step 3. Organize UAT Team (PDLC No. 6B)
    Role: Review schedule/availability with project leader.
    Reference(s): 2.03 UAT Initiation

Step 4. Prepare Initial UAT Work Plan
    Role: If assigned, help prepare UAT team work plan (tasks, estimates, schedule) covering UAT
    Initiation and Test Plan Development phases completely and later phases at high level (see
    below).
    Reference(s): 2.03 Work Plan Preparation; 2.04 UAT Estimating

Step 5. Receive Project Orientation
    Role: Receive orientation on system and project as needed.
    Reference(s): 2.03 UAT Initiation

Step 6. Initiate UAT Plan
    Role: If assigned, help prepare early sections of the User Acceptance Test Plan, dealing with
    testing approach.
    Reference(s): 2.03 Test Plan Preparation; 5.01 User Acceptance Test Plan Template; 5.01 User
    Acceptance Test Plan Sample

Step 7. Conduct Functional Specifications Review (PDLC No. 7)
    Role: Participate if assigned in formal review of Functional Specifications with follow-up review
    as needed.
    Reference(s): 2.01 Key Documentation Preparation and Review

Step 8. Assure Specification Change Control (PDLC No. 8D)
    Role: If assigned, help assure adequate change control for functional specifications.
    Reference(s): 2.02 Requirements/Design Change Management

Step 9. Validate Requirements and Functional Specifications (PDLC No. 8F)
    Role: If assigned, participate in preparing Requirements Hierarchy, reviewing documents in
    detail, and reporting problems found.
    Reference(s): 2.05 Requirements Validation and Defect Reporting

Step 10. Complete UAT Plan
    Role: If assigned, help prepare later sections of the User Acceptance Test Plan, specifying
    builds, test runs, test cases and testing procedures. May be done in parallel with previous step.
    Reference(s): 2.03 Test Plan Preparation; 5.01 User Acceptance Test Plan Template; 5.01 User
    Acceptance Test Plan Sample

Step 11. Submit UAT Plan for Review (PDLC No. 8H)
    Role: If assigned, help revise UAT Plan as needed.
    Reference(s): 2.03 Test Plan Preparation

Step 12. Extend UAT Work Plan (PDLC No. 11A)
    Role: If assigned, help extend UAT team work plan (tasks, estimates, schedule) to cover Test
    Case/Script/Data Development and Prior Test Results Review phases in detail.
    Reference(s): 2.03 Work Plan Preparation; 2.04 UAT Estimating

Step 13. Develop Test Cases, Scripts, Data
    Role: Develop new/revised test cases and scripts; develop or obtain test data.
    Reference(s): 2.03 Test Case, Script and Data Development

Step 14. Complete UAT Work Plan
    Role: If assigned, help extend UAT team work plan (tasks, estimates, schedule) to cover Test
    Execution phase(s) in detail.
    Reference(s): 2.03 Work Plan Preparation; 2.04 UAT Estimating

Step 15. Submit Test Cases, Scripts, Data for Review (PDLC No. 11B)
    Role: Revise test cases, scripts and data as needed.
    Reference(s): 2.03 Test Plan Preparation

PDLC References: 4.01 PDLC Diagram Page 3; 4.02 PDLC Description Page 3

Step 16. Develop Operational Procedures (PDLC No. 12C)
    Role: Participate if assigned in development of new or changed operational procedures.

Step 17. Review Technical Spec., Integrated Project Plan (PDLC No. 12)
    Role: Participate if assigned in review and signoff of “frozen” specifications and project plan.

Step 18. Review Prior Test Results (PDLC No. 16A)
    Role: Review results of unit and integration testing.
    Reference(s): 2.03 Prior Test Results Review

Step 19. Assure Migration to UAT (PDLC No. 17)
    Role: If assigned, help assure/coordinate migration of software to UAT.
    Reference(s): 2.06 Software Migration to UAT and Production

PDLC References: 4.01 PDLC Diagram Page 4; 4.02 PDLC Description Page 4

Step 20. Perform UAT Execution (PDLC No. 19)
    Role: Execute test cases/scripts, reporting and following up problems and change requests;
    report results and status of testing to UAT project leader.
    Reference(s): 2.03 Test Execution Management; 2.03 Status Reporting

Step 21. Analyze Test Results (PDLC No. 20A)
    Role: Participate in assembling and studying all test results and developing recommendations.
    Reference(s): 2.03 Test Execution Management

Step 22. Identify Errors/Issues (PDLC No. 21)
    Role: Participate in addressing and prioritizing errors and issues.

Step 23. Perform Final Regression Tests (PDLC No. 22)
    Role: Retest to confirm resolution of errors and issues.

Step 24. Archive UAT Results
    Role: Place UAT results in archive.
    Reference(s): 2.08 Test Results Archive

PDLC References: 4.01 PDLC Diagram Page 5; 4.02 PDLC Description Page 5

Step 25. Assure Migration to Production (full or partial) (PDLC No. 25)
    Role: If assigned, help assure/coordinate migration of software to Production (partial production
    if pilot or parallel test required).
    Reference(s): 2.06 Software Migration to UAT and Production

Step 26. Perform Changeman Procedures (PDLC No. 26)
    Role: If assigned, participate in sign-off/move schedule/release control.

Step 27. Perform Pilot/Parallel Test Execution (if required) (PDLC Nos. 30, 30A)
    Role: Participate/assist users in executing pilot and/or parallel tests, reporting and following up
    problems and change requests; report results and status of testing to project leader.
    Reference(s): 2.03 Test Execution Management; 2.03 Status Reporting

Step 28. Assist Rollout to Full Production (if required) and Feedback (PDLC No. 31)
    Role: If assigned, help coordinate/assure migration of software to full Production if pilot or
    parallel test has been conducted.
    Reference(s): 2.06 Software Migration to UAT and Production

Step 29. Assure User Training
    Role: If assigned, help coordinate user training.

Step 30. Conduct Post-Implementation Review (PDLC No. 32)
    Role: After three to six months of production, participate if requested in formal review of benefits
    and “lessons learned.”
    Reference(s): 2.10 Post Implementation Review

2 UAT Functions

2.01 Documentation Review

Required Documentation

In order to create test plans and cases, specific documentation is required and used as a guideline. The
User Acceptance Testing Director should receive a copy of the following documentation for distribution
and review when applicable. See the Documentation Check List form.

•  Project Request Form (PRF)
•  Business Requirements Document (BRD)
•  Functional Specifications
•  Technical Specifications
•  Program Specifications
•  Specifications for Interdependent Systems
•  Unit Test Plan/Cases/Scripts
•  Unit Test Results
•  System Test Plan/Cases/Scripts
•  System Test Results
•  Programs/JCL, etc. (placed in UAT libraries)
•  Runbook or Operational Documentation
•  User Documentation

Key Documentation Preparation and Review

Documentation Responsibilities

A development or maintenance project is managed by three “partner” organizations: the Business
Sponsor, the Technology group, and Distribution Services/UAT. All of these organizations are
responsible for the quality of the key documents that specify the content of the project, i.e., the new
and/or changed system functionality.

•  Responsibility for preparing the Business Requirements Document lies with the Business
Sponsor, who assigns the preparation of the document to an appropriate individual(s).

•  Responsibility for preparing the Functional Specifications lies with the Technology group. The
Technology project leader will prepare this document or assign the task to an appropriate
individual(s).

•  Responsibility for reviewing these key documents lies with all three groups, coordinated by the
UAT project leader.

Scope of Reviews

It is important to understand that most projects consist of modifications and/or enhancements to existing
systems. These systems should already have requirements and functional specification documents
covering the pre-existing functionality.

It is not intended that these documents should be reviewed in their entirety in each project. The
documents should, however, be updated to reflect the new and changed functionality. See Section 2.02,
the Requirements and Design Change Management Procedure.

The formal reviews cover work in progress, i.e., the changes and additions to be introduced in the course
of the specific project. These should be documented and presented in such a way as to facilitate
understanding the impact on the existing system and integrating the changes into the existing
documentation.

Conducting the Reviews

For key documentation items, a formal review process will be conducted in accordance with the PDLC
with the participation of representatives of the Business Sponsor, Technology and Distribution
Services/UAT.

The Business Sponsor or Lead Project Manager coordinates the review meeting, obtains the participation
of specific individuals from each group, and assures that the document authors distribute the document
well in advance to each individual participant. The UAT Project Manager will arrange a meeting with the
Business Sponsor or Technologist to obtain clarification or direction, and subsequent meetings as
needed.

Making and Reviewing Revisions

Making revisions to any document is normally the responsibility of the author(s), unless otherwise
assigned.

Once the revisions have been made, the original reviewers, or at minimum a representative of each of
the three “partners,” meets again to review the revisions. This meeting may be convened and
coordinated by any of the “partners.”

Documentation Review Criteria

The participants in the review, including representatives of all three “partners,” review the documents and
evaluate them. This is an overall quality review that is not intended to substitute for the detailed
validation that occurs during test planning. The following thought-provoking questions may be helpful.
The word “requirements” is used generically to refer to both of the key documents.

1. Are these the “right” requirements? Is this the optimum method to produce the desired results?

2. Are the requirements complete? Does the design specify all relationships between modules, how
they pass data, what happens in exceptional circumstances, what starting state should be
assumed for each module? In particular, do the functional specifications completely and correctly
cover the requirements?

3. Are the requirements compatible with other systems?

4. Does the user understand the requirements?

5. Are the requirements achievable? Do they assume the hardware works more quickly than it
does? Do they require too much memory or too many I/O devices?

6. Are the requirements reasonable? What are the tradeoffs between development speed,
development cost, product performance, reliability and memory usage? Have these issues been
considered?

7. Are the requirements testable? How easy will it be to tell whether the design
documents match the requirements?

8. How well is error handling covered? Is it handled at the correct level before the user is forced to
backtrack, cancel a module, or correct work completed?

2.02 Requirements/Design Change Management

Introduction to Requirements and Design Change Management

Changes to systems fall into two major categories. Some are intended to remove defects, and are
classified as problems. All other changes are classified in this organization as enhancements, since they
are intended to make the system more useful to the business. Change management covers the
processing of both types of change requests, at any point in a development or maintenance project.

A disciplined project life cycle such as the PDLC requires that deliverables, or work products, be created
in a specified sequence, each based on its predecessors. Subsequent changes must first be introduced
into the earliest affected product and then propagated into later products in the same sequence.
Therefore, when a change is being made, the impact on all work products relating to the system must be
considered.

Some problem reports indicate an error in a specific product; thus, earlier products would be unaffected.
Some technical, or detailed, changes, resulting from either problem reports or change requests, might not
affect the requirements or high-level design. However, once a product is found to be affected, all later
products for which it is an input must be examined and revised appropriately.

The formal change management process applies to any approved version or release of a work product,
not to the initial creation of the work product, which is based on its predecessors. Revision of an
approved work product consists of the following major activities:

•  Submitting change requests

•  Evaluating change requests

•  Making and documenting changes

•  Reviewing and approving changes

•  Distributing revised work products and notifications.

A maintenance project, or an additional release within a project, requires revisions to existing work
products. The first two activities above, as described in this procedure, do not apply to such revisions,
since they are requested and evaluated as new requirements.

Also, when a formal review of a work product such as a Business Requirements Document (BRD) is
conducted, the findings of the review are taken as approved change requests, covering the first two
activities. They are documented as a group and do not need to be submitted as individual requests.
Once these have been incorporated and the follow-up review has been passed, the work product is
considered approved. Thus, formal change management begins in the PDLC when the BRD, the first
detailed work product, has been reviewed and approved.

Procedure Scope: Types of Work Products

Some aspects of making a change are dependent on the type of work product and the software
environment in which it is stored. Many work products are produced as documents, and may be stored in
word processing files, online databases, or a document change control tool. The documents may have
attachments that are stored using other software such as project, or test, planning and management
tools. There are several types of document work products.

•  Requirements and design documents enable the development of subsequent work products and
provide information to future system maintainers.

•  Documents such as user manuals enable the delivered system to be used effectively and also
provide information to future maintainers.

•  Documents such as project and test plans enable the development and test project leaders to
control the project.

Requirements and design documents are the focus of this procedure. Other documents can be
maintained using a similar process.

Non-document work products will also require modification as a result of requirements and design
changes.

•  Program code and similar products, such as JCL and database specifications, are processed by
software such as compilers, and become physically part of the system.

•  Test cases and test scripts are stored as pre-formatted online documents, to be executed
manually by testers, or as test tool scripts to be executed automatically by the tool.

•  Data tables and files, both test and production versions, are stored by the operating system or
DBMS and processed by the system itself.

Because of the different environments in which work products may be stored, this procedure does not
contain all the physical details of recording and documenting the change.

Submitting Change Requests

In general, a requester submits a change request, classified as a problem or enhancement. (See the
Introduction to Requirements and Design Change Management.)

The requester assigns the request to the appropriate technology area, which may reassign it if
necessary. Requesters fall into several categories independent of whether the requests are problems or
enhancements.

•  Work product reviewers submitting requests individually or on behalf of a group, e.g., a group of
walkthrough participants.

•  Test planners performing requirements validation (see the Requirements Validation and Defect
Reporting procedure).

•  Developers using a work product as input in producing a subsequent product, e.g., using a
design document in developing a source program.

•  Developers, testers and users performing test execution.

•  Users submitting requests for any other reasons during the course of a project.

Requests should be submitted using the online Requirements Trouble Report form in Lotus Notes
wherever feasible. Requests not submitted in this form must be documented in the online database as
soon as possible. (See the topic, Evaluating Change Requests.)

Appropriate views, e.g., based on the current status of the request, will allow each individual concerned to
view requests as a group, track status and take appropriate action. This makes it generally unnecessary
to e-mail the requests. Particularly urgent requests, or follow-up queries, can be e-mailed.
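
Taken together, the submission flow above amounts to a simple record structure with a status lifecycle.
The sketch below is illustrative only: the field and status names are assumptions chosen for clarity, not
the actual fields of the Lotus Notes Requirements Trouble Report form.

    from dataclasses import dataclass, field
    from enum import Enum
    from typing import List


    class RequestType(Enum):
        PROBLEM = "Problem"            # intended to remove a defect
        ENHANCEMENT = "Enhancement"    # any other change


    class RequestStatus(Enum):
        SUBMITTED = "Submitted"
        CLARIFICATION_NEEDED = "Clarification Needed"  # returned to the requester
        UNDER_EVALUATION = "Under Evaluation"
        INVALID = "Invalid"            # returned with reasons documented
        APPROVED = "Approved"          # concept approved; work products to be revised
        DEFERRED = "Deferred"
        REJECTED = "Rejected"


    @dataclass
    class ChangeRequest:
        request_id: str
        requester: str
        request_type: RequestType
        description: str
        assigned_area: str             # technology area; may be reassigned
        status: RequestStatus = RequestStatus.SUBMITTED
        evaluator: str = ""            # normally the author of the first affected work product
        affected_work_products: List[str] = field(default_factory=list)
        disposition_notes: str = ""    # assessment and reasons, recorded on the online form

Status-based views over such records correspond to the database views described above: each
participant can track requests as a group, which is why routine e-mail distribution of requests is
unnecessary.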

Evaluating Change Requests

STEP 1: Receiving and Assigning the Request

The project leader, or other designated change request liaison, must be aware of incoming change
requests and be sure they are documented and evaluated appropriately.

•  For requests submitted through the designated online database, the liaison should check the
database at least once daily, more often as the volume of changes grows and a project moves
toward completion.

•  For requests received by e-mail, fax or phone, the liaison must create a request document
immediately in the designated online database.

If a request is incomplete or unclear, the liaison should return it immediately via the online database, e-
mail (pasting the form into the message) or fax (in order of preference), indicating the specific areas or
issues to be clarified. Database transmission is done by simply updating the online form to indicate
“clarification needed,” which should also be done before sending the form by e-mail or fax.

In principle, responsibility for a request requiring clarification rests with the requester at this point.
However, the liaison should follow up requests that raise important issues when clarification is not
received promptly.

The liaison assigns an accepted request to an evaluator, normally the author of the first work product
likely to be affected, to evaluate the validity of making the change, as well as the potential impact on the
project and the system itself and the resulting benefit (Steps 2 and 3 below). In a small project, or if
otherwise appropriate, the liaison may perform one or both of these steps as well.

STEP 2: Assessing the Validity of the Request

Bundling changes into versions or releases will reduce the cost per change and generally lead to greater
accuracy. The changes can then be evaluated as a group rather than individually.

If the requester cannot completely describe the suggested correction or change (e.g., is not familiar with
system internals), the evaluator should complete the description, consulting with the requester and others
as needed. The evaluator assesses the validity of the request based on the following factors:

•  For a problem: Is the reported defect actually present, as claimed? (See Section 2.05, the
Requirements Validation and Defect Reporting procedure, for a discussion of types of defects in
requirements documents.) Will the proposed correction remedy the problem and meet the
criteria below for enhancements, where applicable?

•  For an enhancement: Is the requested change technically feasible, consistent with the overall
design philosophy or architecture and the applicable standards, appropriate in terms of
functionality or performance from the user perspective (which may require user input), compatible
with other changes being made, and overall the best way of accomplishing the desired objective?

The assessment, and any information added as above, should be documented on the change request
form in the database, including the reason(s) if the request is considered invalid. If invalid, the request is
returned to the requester.

The requester may accept the assessment, discuss the request with the evaluator, resubmit the request
with modifications, or escalate the disagreement. It is the requester’s responsibility to follow up requests
and know their status, by periodically checking an appropriate view of the database.

STEP 3: Evaluating the Impact and Value of the Change

If the request is assessed as valid, the next step is to assess the impact and net value of the proposed
change. Some requests, e.g., those involving incomplete or inconsistent information in a requirements
document, must be responded to in order for the project to proceed. Others, including some in the
problem category, may need to be deferred or rejected, e.g., if the system has been “frozen.” The factors
listed below should be evaluated for both problems and enhancements.

•  What deliverables must be changed? Changes in requirements and design will generally impact
all later work products.

•  What will changing all of the work products add to the project’s resource requirements, cost and
duration, and what is the potential benefit?

•  To what extent will the project’s risk be increased or decreased? Making any change creates an
a priori risk of unintended changes or side effects.

Users should be consulted when the cost, schedule or risk impact is significant, even if the benefit
appears commensurate.

The evaluator notifies the requester of the disposition of the request via an entry on the online document.
The requester can respond as above, in the case of disagreement. If accepted by the evaluator, the
concept of the request is considered approved. However, approval of the revised work product(s) will be
needed to assure that the concept has been realized. The evaluator specifies the required approvals for
each work product as discussed below. The request is then normally assigned to the author of the first
affected work product to make the changes and notify any authors of other affected work products.

Making and Documenting Changes

As noted in the topic, Procedure Scope: Types of Work Products, earlier in Section 2.02, there are
several types of work products. All are stored on-line in some form and most, including source programs,
are edited directly to make changes. However, changing data tables and files, particularly those used for
testing, is frequently overlooked when requirements or design are changed, and will generally need to be
done using utilities, test tools or custom file maintenance routines. Such routines may be part of the
system being developed or may need to be developed along with the system.

Changes must be documented, whether initiated as new requirements or in response to change requests.
Documenting changes means creating an audit trail, i.e., ensuring that the nature and location of the
changes, and the reason(s) for making them, are visible to those using the work product in the future.
The approach depends on the type of work product and the medium in which it is stored.

For requirements and design documents, the following considerations apply:

•  A change log at the beginning or end of the document should briefly summarize each set of
changes, giving the number(s) of the change request(s), date, version/release number, name of
the person making the change(s), and a brief summary of the changes and reasons for making
them.

•  It is helpful, for major changes that take significant time to accomplish, e.g., new user
requirements or large groups of change requests, to list changes in progress and where possible
future changes at the end of the change log with estimated completion dates and expected
version numbers when known. With little alteration, “future changes” become “changes in
progress,” which become the records of the changes once finished.

•  Concurrent or parallel update should be avoided wherever possible. When the document is
physically open for update, it should be locked for update by others where the environment
permits, or separate working copies provided and final update coordinated.

•  To the extent possible in a given software environment, without making the document confusing,
it is essential to indicate where text has been changed or added. The minimum requirement is
the vertical margin bar or equivalent covering all new or changed lines, including lines where text
was deleted. This facilitates both use and review of the document.

•  Use of version numbers (m.n) promotes bundling of changes and facilitates notification and
awareness of changes, and is particularly appropriate in a formal library maintenance
environment, e.g., Changeman, where updates are not instantaneous. Even if changed work
products are immediately accessible on-line, a process of constant ongoing change makes
notification and control impossible and is not desirable.

•  Note that in a release management environment, it is best to use the same version number for all
work products in a release. A suffix (m.n.A, etc.) can be added if it is necessary to have multiple
versions of some products for the same release.

•  Each version should be archived, or made reproducible via a library or change management tool.
The former alternative is simpler and more reliable.
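
As an illustration of the numbering convention, the sketch below encodes the m.n scheme and the
optional per-product suffix (m.n.A). It is a minimal sketch of the convention as described above; the
function names are hypothetical and not part of any tool mentioned in these procedures.

    def next_version(current: str, major_change: bool) -> str:
        """Increment an m.n version number. Incrementing the whole-number
        part signals major changes calling for reviews comparable to the
        initial reviews of the product; minor revisions increment the
        decimal part only."""
        m, n = (int(part) for part in current.split(".")[:2])
        return f"{m + 1}.0" if major_change else f"{m}.{n + 1}"


    def product_version(release_version: str, suffix: str = "") -> str:
        """All work products in a release share the release's m.n version;
        a suffix (m.n.A, etc.) distinguishes multiple versions of some
        products within the same release."""
        return f"{release_version}.{suffix}" if suffix else release_version


    # Examples:
    #   next_version("2.3", major_change=False)  ->  "2.4"
    #   next_version("2.3", major_change=True)   ->  "3.0"
    #   product_version("2.4", "A")              ->  "2.4.A"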

Reviewing and Approving Changes

The objectives of review and approval of completed, as opposed to proposed, changes are as follows:

•  To assure that changes made in response to change requests fulfill the requests once they have
been clarified and completed, and that no unintended changes have been made.

•  To assure that the changes are consistent with changes made in previous work products.

•  To assure that all changes, once completed, meet the validity criteria for enhancements as
described in Step 2 above, given that some implementation aspects may be determined as the
change is made.

Review and approval have the following relationship:

•  Approval always implies review - no one should approve a changed work product without
reviewing the new and changed material. Specified approvals are required steps in producing or
revising a work product.

•  Review does not always imply approval - a peer may review a work product to provide helpful
input without having approval authority. Such reviews are optional.

The required approvals depend on the magnitude of the changes being made at a given time, whether in
response to change requests or new requirements. The approvals will not normally exceed those
required for the original work product.

When changes to requirements, design, or other document work products are completed, the author or
other reviser increments the whole or fractional part of the version number, or the suffix, depending on
the magnitude of the changes, and submits the document as a draft for the approvals specified when the
revisions were assigned. Comments should be provided, and responded to, as quickly as possible.

Version numbers should be used meaningfully in that a change in the whole number part of the version,
as opposed to the decimal part, implies major changes that would call for reviews comparable to the
initial reviews of the product.

Given that anyone dealing with a system can request a change, not every requester is automatically
included as an approver. The requester may be included for major or complex changes to assure that
the concept is correctly realized, or “ex officio” if the individual has a major role in the project, e.g., the
user liaison. Such an individual would also approve changes requested by others. However, the
requester alone cannot approve a change - a separate sponsor is needed. Once notified, any requester
or project sponsor can resubmit the request with an appropriate notation if not satisfied with the planned
changes.

The names of the approvers are listed at the beginning of the document, e.g., on the title page, or else at
the end.

Distributing Change Notifications and Revised Work Products

Objective

The objective of this activity is to be sure that all users of revised work products are both aware of and
using the current version of the product. Versions lacking specified approvals may be distributed for
review or urgent use marked “DRAFT,” without replacing the current approved version.

Distributing Change Notifications

Everyone on the distribution list for a work product should be notified of the publication of a revision,
preferably by electronic mail. Citimail should be used for anyone who is not on e-mail (Notes).
Individuals who receive hard copy distribution of a product should be notified so that they can request
another copy if it does not arrive, and examine the changes when it does arrive. Those who have on-line
access to the product should be notified so that they can examine the changes online.

The project should maintain a control list of the current versions of work products, with the latest revisions
to the list marked as in any other document (e.g., by margin bars). This can ideally cover all work
products, but at a minimum should cover all document work products. Depending on the document
storage environment, the list might be stored and updated automatically, or generated on request as a
pre-formatted report.

This list should be included with hard copy distributions of any document, and available on-line to those
with access to the document storage environment. The list itself can be sent as the notification.
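
For illustration, such a control list might look like the following. The work products and version numbers
are invented for this example; the margin bar in the left column marks the latest revision, as in any
other document.

        Work Product                        Current Version   Date Approved
        Business Requirements Document      2.1               (date)
        Functional Specifications           2.1               (date)
    |   User Acceptance Test Plan           2.2               (date)
        User Manual                         2.1               (date)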

Distributing Revised Work Products

The reviser sends the revised work product to the same distribution list as the previous version. The
distribution list should be included at a standardized position in the document. The names in the list
should be reviewed and changed as appropriate as project staffing changes, and the appropriate medium
(hard copy, online database, or other electronic medium) and address for each person, which depends on
the type of deliverable, may also change.

Where an addressee has access to the product in its normal on-line medium, generally no actual
transmission will be needed. Electronic access is preferable to hard copy distribution, and the practice of
distributing revision pages to be inserted by the recipient should be avoided.

2.03 Test Planning and Management

Introduction to Test Planning and Management

For even a medium sized project, testing activities usually represent a significant portion of the overall
project effort, involving several staff members working in parallel. Planning and managing these activities
effectively is a complex undertaking, and requires rigorous discipline and the application of proven
techniques.

Each level of testing must be planned and managed using the most appropriate testing-related
procedures. User Acceptance Testing is the key, final testing phase for systems development and
maintenance projects in Global Cash Management Services. The UAT organization is separate from the
Technology, or development organization, with its own management structure. The UAT organization
plans and manages the UAT phase of each project as an independent project in itself.

Key UAT activities covered by this procedure are:

•  UAT initiation
•  Test plan preparation
•  Work plan preparation
•  Test case, script and data development
•  Prior test results review
•  Test execution management
•  Status reporting.

Test planning includes all but the last two activities, as well as requirements identification and validation,
which is covered in the Requirements Validation and Defect Reporting procedure. While the work plan is
presented as an attachment to the test plan, it is discussed separately in this section since it requires
general management disciplines and the use of a standard project planning tool.

Key to testing management is the gathering of information on the status of tasks in progress, and taking
proactive and corrective action to keep the project on track. An important aspect of status reporting is the
gathering of statistics that will assist the planning of future projects.

Required Background for UAT Planning and Management

While the testing-related procedures found in this chapter, and the UAT Test Plan Template and Sample
found in Chapter 5, Attachments, detail the specific steps to be followed, effective test planning and
management require several types of background:

1. Knowledge of, and experience with, testing concepts and techniques, particularly those relevant
to UAT.

2. Knowledge of, and experience with, general planning and management techniques, including
ideally the use of MSProject.

3. Familiarity with the development, user and UAT organizations’ structure, staffing and procedures.

4. Familiarity with the system being tested and the objectives of the current development or
maintenance project.

The first three are prerequisites for leading UAT projects. The fourth is acquired during the early phases
of the UAT project itself, as well as during any previous projects dealing with the same system.

Supporting Procedures and Materials

Guidance and support for the UAT project leader in performing the activities described in this procedure
are found throughout the UAT Life Cycle material, particularly in the following procedures and
attachments:

 The UAT Team Leader Road Map in Section 1.02, Road Maps, provides a high-level overview
of the sequence of activities in which the project leader participates.

 The Sample Work Breakdown Structure (WBS) at the end of Section 2.03, Test Planning and
Management, provides a template for the Work Plan section of the UAT Plan, indicating the
activities to be carried out and managed, as described in this procedure. The high-level activities
are named uniquely, e.g., with “UAT” in the name, so that the UAT plan can be folded into the total
project plan.

 The UAT Test Plan Template in Chapter 5, Attachments, enables a UAT plan to be created by
filling in specified information in a copy of the Template document.

 The UAT Test Plan Sample in Chapter 5, Attachments, is an actual test plan in the format of the
Template, to further clarify the use of the Template.

 Section 2.04, UAT Estimating, provides guidance in estimating the effort for UAT planning and
execution.

This section links this material together, providing a step-by-step approach to carrying out and managing
the tasks listed in the WBS.

UAT Initiation

The UAT project leader will obtain orientation on the project prior to UAT initiation via the High-Level
Business Requirements Document. Further detail will be obtained subsequently by participating in the
review of the Business Requirements Document (BRD) and, later, the Functional Specifications.
Orientation on a pre-existing system, if it is unfamiliar, should be obtained by reading prior
documentation, or by meeting with the sponsor and Technologist when documentation is not available.

The first activity to be performed in the UAT initiation phase is to schedule and conduct a kickoff meeting
involving the three key players:

 The business sponsor of the project or lead project manager
 The lead member of the Technology (development) team
 The lead member of the UAT team.

This will enable all concerned to know and agree on the nature, magnitude and expected timing of the
development or maintenance project and the names and roles of all those involved.

The next activity is to organize the UAT team.

1. Determine the required staffing level and background based on the above information.

2. Determine the availability and obtain the commitment of appropriate staff for the anticipated time
period. If outside staff are to be used, lead time is generally required to approve their use and to
select the vendor and individuals.

3. Make contact with the user organization, to determine whether any additional individuals will be
members of the UAT team and the extent of their availability, as well as to establish any other
user contacts, particularly for system acceptance. The person who will accept the system, by
definition, cannot be a member of the test team.

The next activity is to prepare the initial UAT work plan, as discussed in the next topic, Initial Work Plan
Preparation.

The next activity is to provide orientation to the team on the system, if pre-existing, and the current
project. It will be helpful to ask the development staff to make a presentation to the team. For existing
systems, users may not require this orientation, but may require orientation in testing techniques.

The orientation should be provided as close as possible in time to the start of work on the project by the
team members, who will generally fall into two groups. Early test planning implies that there will be a time
interval between UAT planning and UAT execution. Thus, any team members besides the project leader
who will be involved in test planning or test case development need orientation earlier than those who will
be involved only in test execution.

Initial Work Plan Preparation

Preparation of the initial Work Plan should be done as early as possible because without a plan the
project is not under control and there is nothing against which to report status. In particular, test planning
is a major part of the project and is to be done under the control of the Work Plan.

The Work Plan is based on a Work Breakdown Structure (WBS), i.e., a hierarchical breakdown of tasks,
to which estimates of effort, dependencies, and start/end dates are added. See the topic, Developing the
UAT Work Breakdown Structure.

To produce the Work Plan, the test planner:

 Lists the milestones and tasks (the WBS) and the resources.

 Inputs any required dates and dependencies for the tasks and milestones, including the
anticipated dates for external milestones.

 Estimates the effort for each task, as discussed in Section 2.04, UAT Estimating.

 Assigns resources to tasks based on the skills and availability of the resources and the estimated
effort for each task.

With this information, the planner completes the project schedule, a task assisted by the planning tool.

The initial plan should be prepared as soon as the project leader has obtained orientation and organized
the team. Even if these startup activities have been completed when the plan is prepared, their inclusion
in the plan is necessary so that the plan will reflect the total effort and duration of the project.

The initial work plan should contain the following:

1. The complete plan for the UAT Initiation, UAT Planning and Prior Test Results Review activities,
subject to later modification if needed.

2. A high-level plan for the Test Case, Script and Data Development, and UAT Execution activities,
reflecting as much as is known or can be estimated at this time. While the build structure, for
example, will not yet have been defined, it may be known whether there will be parallel or pilot
tests, and which of the setup activities will be needed at each test level.

3. A preliminary estimate for the Management and Status Reporting effort, reflecting the size and
duration of the project.

In MSProject terms, the initial work plan is a partial baseline plan for the project. This plan will provide an
initial basis of communication among the team, and with UAT and user management. Status reports
published during test planning (see the Status Reporting topic) should be based on the initial work plan.

Work Plan Completion, Update and Revision

The work plan is completed when the test plan is substantially complete, as discussed below. The test
runs and test cases must be laid out in order to complete the work breakdown, estimates and schedule.
However, detailed information needed for test execution that does not impact the work plan, e.g., Section
6 of the Test Plan, can be completed in parallel with the work plan.

The normal updates to the work plan are the hours spent on each task, the percent completion, and the
actual completion date, when finished.

The baseline work plan for the project should not be changed. However, it may be appropriate to prepare
revised plans as circumstances change. A plan should not be automatically changed whenever the
project deviates from the plan, since this will occur to some degree most of the time, and the value of the
plan will be destroyed. However, the current plan should reflect actual, realizable intentions, thereby
providing useful information. Reasons for departing from the baseline plan should always be
documented.

Developing the UAT Work Breakdown Structure

A sample work breakdown structure (WBS), or hierarchy of tasks, is provided at the end of the Test
Planning and Management section. In the sample WBS, the tasks are presented in logical order of being
started. Milestones, indicated by (M), in some cases represent completion of activities external to the
work plan; all other items relate to activities within UAT. The external milestones allow for documentation
of key predecessors of internal tasks and remind test planners to find out key external completion dates.

The tasks indicated are a guideline; not every task is required in every case. As noted previously in this
section, while allowance is made for normal UAT, Parallel and Pilot test levels, all three are rarely, if ever,
included in a project. In particular, non-software deliverables would generally be tested at only one of
these levels. As a convention, the lower level tasks, from which the estimates and schedule must be built
up, are written using active verbs, although in some cases planners may break them out further.

It is recommended that tasks be broken down at least until each task at the lowest level can be
performed by a single person in approximately 40 hours of effort or less. While some tasks already listed
may be smaller than 40 hours, they should generally be retained to be sure they are included. In most
cases, the nature of the tasks will make a more detailed breakdown advisable. However, it is suggested
that tasks be broken down to less than 7 or 8 hours of effort only if they are different from each other in a
significant way.

It is also recommended that tasks be broken down at least until each can be assigned to a single person.
This simplifies estimating, because the effort of only one person, with a single skill set and hourly cost (if
cost is to be calculated) needs to be estimated for each task. It also makes it easier to manage the
project since there is only one person to deal with for each task.

Using an appropriate level of detail makes the estimates of effort and schedule more reliable. It also
provides the ability to track the tasks effectively once they are in progress, e.g., to know why a project is
behind schedule. However, an excessively detailed breakdown is likely to add more “noise” than
information to the estimates and will make the tracking process unduly complex. At the detail level, tasks
are listed more to track completion than to track level of effort, since variances should average out as the
tasks are aggregated.

In completing the WBS, the test planning tasks may not require additional detail beyond what is contained
in the outline. However, the total set of test case development and execution tasks can only be known
once the requirements have been decomposed and the hierarchy of builds, test runs and test cases
determined. This is the major reason for preparing the work plan and estimates in phases.

Test Plan Preparation: Overview

In the UAT Plan Template, descriptions of the information to be filled in are enclosed in braces: {...}.
Optional items are enclosed in brackets: [...]. The items in braces should be deleted as the actual
content is filled in. Some of the items are explanations that are removed without being replaced. The
items in brackets should be deleted if not used; if used, the brackets should be deleted.

The Edit/Find command can be used to locate the bracket and brace characters to be sure that no items
are missed. However, the test planner should read all of the template text, since it may be necessary to
modify some of it.

The material in the template is presented in a sequence that is logical for readers of the plan, once it is
completed. While the sections are covered here in the same sequence for ease of reference,
the information in the Test Plan should be completed in the order in which it becomes available, as
discussed in the next topics. Thus it is best to read this procedure completely before beginning the plan.

The next step is to make a copy of the Template, and review the entire document, spending appropriate
time depending on prior familiarity.

Note that user participation in developing the test plan should be sought to the maximum extent possible.
At a minimum, the test plan must be reviewed by the sponsor and users prior to being finalized.

UAT Plan Section 1. Introduction

To start the test plan, the Introduction section is completed, based on the initial orientation to the project,
and the remainder of Section 1 filled in.

 Section 1.1, Document Structure, should be completed as initially planned, then revised as
needed when the remainder of the document is completed.

 Section 1.2, Purpose and Scope..., should now be completed.

 Section 1.3, Assumptions, is extremely important. All assumptions on which the work plan and
the entire test plan depend must be listed. The accuracy of all assumptions, in so far as can be
known, should be confirmed. However, the major reason for listing assumptions is that they
describe circumstances that may change, which would significantly impact the project. Additional
assumptions may come to light as the plans are completed.

 Section 1.4, Risk Assessment, should include the potential impact if assumptions are violated.
However, not every risk necessarily corresponds to an assumption - every significant potential
risk should be listed. While numerical estimates are not required, the severity of a risk is
conceptually its likelihood of occurrence multiplied by the impact, e.g., in dollars, if it does occur.
Note that most risks are independent of each other, and the likelihood that at least one of the risk
factors will actually occur increases with the number of risks, and will exceed the likelihood of any
single risk occurring.
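
For illustration only (the probabilities below are assumed, not part of the Template): with ten
independent risks of 5% each, the likelihood that at least one occurs is 1 - 0.95^10, or about 40% -
well above the 5% chance of any single risk. A minimal Python sketch of this arithmetic:

    # Illustrative only: combined likelihood of independent risks.
    # The individual probabilities are assumptions, not figures from any plan.
    risk_probabilities = [0.05] * 10  # ten independent risks, 5% each

    p_none = 1.0
    for p in risk_probabilities:
        p_none *= (1.0 - p)  # probability that no risk occurs

    p_at_least_one = 1.0 - p_none
    print(f"P(at least one risk occurs) = {p_at_least_one:.0%}")  # ~40%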

UAT Plan Section 2. Strategy

Section 2, Strategy, provides the basis for development of the test scripts and cases and is the key to
applying a structured testing methodology.

 Section 2.1, Test Levels, should be completed as early as possible, in consultation with the
users, since it will affect the total work plan, completion dates and required facilities. Both
parallel and pilot testing provide increased confidence in installing a high-risk system or release,
but parallel testing is possible only when there is a comparable system or manual process in
place. Both will rarely be performed in the same project.

 Section 2.2, Types of Tests, may be deferred until the requirements have been fully examined.
Environment tests are particularly relevant in the case of a new environment (technology
platform), or multiple environments. Positive and negative functional tests, invalid input tests,
and usability tests should always be included. Control, security, and capacity/performance tests
will be required if there are requirements in these areas, as there should be for new systems.

 Regression tests will always be required. Those verifying that errors found during the test
execution phase have been correctly fixed without introducing new errors cannot be specifically
anticipated. However, in a maintenance project, tests verifying that functions already in
production still work after the specified changes are made can be planned in advance. These
test cases should be selected, where possible, from those used on the system in previous
projects.

Section 2.3, Requirements Identification, asks for a list of source documents, which should be filled in
immediately and updated as new versions are released and the requirements hierarchy is updated.

The requirements hierarchy will generally be an attachment due to its size and the fact that it may be
stored in a different environment. (See the discussion under “Requirements Decomposition” in the
Requirements Validation and Defect Reporting section.)

Section 2.4, Requirements Coverage Strategy, essentially asks two questions:

1. How will the testing effort be allocated among the different requirements areas?

2. How can the test planner be confident of having achieved the specified coverage when actually
writing test cases?

It is not necessary to complete the requirements hierarchy in order to answer these questions. Rather,
the approach to verifying coverage may impact the decision on how the requirements hierarchy will be
prepared and stored.

Section 2.5, Build and Test Run Strategy, is key to achieving and verifying the desired requirements
coverage. As noted in the Template, builds are the highest level grouping of test cases. Depending on
how a development or maintenance project is structured, builds may correspond to any of the following:

 Physical portions of the system (e.g., groups of software modules), or non-software deliverables
such as a user manual, whether delivered by the developers one at a time or all together. If the
system spans environments, e.g., mainframe and workstation, separate builds should be defined
for each.

 Discrete areas of functionality such as access validation or reporting.

 Discrete types of tests such as capacity/performance.

The build structure may be the same as that used in integration testing, and may correspond to
production releases, in which case it may be maintained in parallel and/or pilot testing.

Delivering the system to UAT build by build allows testing to be overlapped with development, reducing
the total project duration. However, such a strategy must be coordinated with the developers. If the
system is delivered to UAT all at once, the choice of build structure is fairly open, but it is generally best
for the build sequence to follow the data flow, so that functions that produce results used elsewhere in the
system are tested first.

Another aspect of defining software builds is specifying the test files required for each build, since some
may not be available early in the project, or the option may be taken to let the system itself produce some
of the test files. Test tools required for each build will depend on the environment and the type of testing
being done. For software builds, the final step is to define the test runs, i.e., groups of test cases, to be
executed. In this top-down design process, the test cases are specified later.

For non-software builds, list the specific components being delivered, the criteria that they must meet,
and the approaches to validation. Test runs should be defined if the validation approach includes testing,
e.g., having users run the system based on the instructions in the user manual.

Sections 2.6 and 2.7 will define the parallel and pilot test approaches. Since the objective is to validate
the system in a production environment, the entire system must be tested, unless the system will be
placed in production in stages.

 For a parallel test, the key issues are what previous system or process the new system or
release will be compared with, what specific results will be compared, and how the comparison
will be verified.

 For a pilot test, the key issues are to define the limited, pilot environment and how it will be
expanded to full production, and to define the criteria for success.

Typically, these tests simulate normal business processing, which then defines the sequence in which
system functions are tested.

Section 2.8, Requirements Validation Matrices, presents a suggested approach to verifying requirements
coverage. While other approaches may be used, this presentation clarifies the key concepts, which are:

 High level requirements should be covered (validated by and mapped to) builds.

 Intermediate level requirements within each build should be covered by test runs.

 Detail requirements within each test run should be covered by test cases (which will be planned
in Section 5).

 The planning of tests at each level is done concurrently with the preparation of the Requirements
Validation Matrix or other coverage measure, to assure the adequacy of the tests in covering the
requirements.

 Generally a build, test run, or even a test case, may test a group of related requirements at a
given level.

 Some individual requirements, e.g., a requirement for context-sensitive help, may need to be
tested in multiple test cases, test runs, or even builds.

The online test case/script database may be used to establish coverage by mapping the test cases to
specific system functions.
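
To make the coverage check concrete, the following minimal sketch (all requirement and test case
identifiers are invented for illustration; an actual matrix would be maintained in the online test
case/script database or a plan attachment) represents a detail-level Requirements Validation Matrix
as a mapping and flags any uncovered requirements:

    # Illustrative sketch of detail-level requirements coverage checking.
    # All identifiers are hypothetical.
    coverage = {
        "REQ-1.1": ["TC-001", "TC-002"],
        "REQ-1.2": ["TC-002"],            # one case may cover several requirements
        "REQ-2.1": [],                    # not yet covered by any test case
        "REQ-3.4": ["TC-010", "TC-011"],  # e.g., context-sensitive help, tested in several cases
    }

    uncovered = [req for req, cases in coverage.items() if not cases]
    print("Requirements without test cases:", uncovered)  # ['REQ-2.1']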

UAT Plan Section 3. Work Plan

Section 3 presents the Work Plan, i.e., the hierarchy of tasks and milestones with start and end dates,
estimated effort, and assignment of staff. Section 3.1, Organization and Responsibilities and Section 3.2,
Major Tasks, Milestones and Target Dates, should be completed in the body of the document as soon as
possible for the convenience of readers and reviewers, while Section 3.3, the Work Breakdown Structure,
will be developed using a tool such as MSProject and presented as an attachment to the plan, using the
sample Work Breakdown Structure (WBS) as a guide.

Each task must be defined in terms of effort, staffing, start and finish dates and dependencies, i.e., logical
predecessors. The predecessors can be defined for groups of tasks. There are four kinds of task
relationships:

 Finish-to-start (FS): The task (B) cannot start until another task (A) finishes.
 Start-to-start (SS): The task (B) cannot start until another task (A) starts.
 Finish-to-finish (FF): The task (B) cannot finish until another task (A) finishes.
 Start-to-finish (SF): The task (B) cannot finish until another task (A) starts.

These relationships follow from the dependence of a task on some output of a previous task. The critical
path, the series of tasks that must be completed on schedule for the project to finish on schedule, will be
determined by the planning tool. Each task on the critical path is a critical task.
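
The planning tool performs this calculation automatically, but the underlying idea can be shown with a
minimal sketch (finish-to-start dependencies only; the task names and durations below are invented for
illustration):

    # Minimal critical-path sketch using finish-to-start (FS) dependencies only.
    # Durations (in days) and task names are hypothetical.
    durations = {"plan": 5, "cases": 10, "data": 4, "execute": 15}
    predecessors = {"plan": [], "cases": ["plan"], "data": ["plan"],
                    "execute": ["cases", "data"]}

    earliest_finish = {}

    def finish(task):
        # Earliest finish = latest predecessor finish plus the task's own duration.
        if task not in earliest_finish:
            start = max((finish(p) for p in predecessors[task]), default=0)
            earliest_finish[task] = start + durations[task]
        return earliest_finish[task]

    project_days = max(finish(t) for t in durations)
    print("Project duration:", project_days, "days")  # 30: plan -> cases -> execute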

Section 3.4, Resources Needed for Test Execution, asks for a logical specification of hardware, software
and networks so that the test team, together with support staff, can select the actual components and be
sure they will be available when required. It is essential to document and respond to any issues that may
delay or limit the availability of these resources.

UAT Plan Sections 4-6. Procedures, Case Design, Test Execution

Section 4 documents the procedures that will control test execution. The scope of each procedure is
defined in the introductory text. When writing the test plan, any existing standard procedures need only
be referenced. Details of each procedure must be documented in the plan if no standard procedure
exists or if the procedure will be varied from the standard.

Section 5 presents the breakout of test runs into test cases. At this point, the test cases are being
planned, or designed in terms of what each one will cover, but not actually developed. The planning of
test cases is done concurrently with the preparation of the Detailed Requirements Validation Matrix, as
described above.

Once the test cases have been planned, their development can be started. (See Test Case, Script and
Data Development.) This task can be delegated to one or more team members, who can work in parallel
on different builds and test runs.

Section 6 contains the remaining information needed for test execution. It can be developed in parallel
with the development of the test cases.

 In Section 6.1, Components to be Tested, the individual software modules and procedures are
listed as the information becomes available. It will be important to be able to verify that a build
contains the correct component versions, since missing or incorrectly migrated components can
cause test cases to fail while the correct component or version is not actually being tested. Any
such problems must be eliminated before the defects in the intended deliverables can be
identified.

Manual (user and administrative) procedures will be used to control the test, thus they must also
be identified. Those procedures that are new or changed will themselves be tested.

 In Section 6.2, test data components are identified. Using incorrect test data can produce
misleading results, causing test cases to fail even though the software itself is correct. This section
documents for all concerned the correct name and type for each test file. Indicating the type of
access is essential in planning whether and how the component must be refreshed after it is used
in each build.

This completes the sections of the standard test plan.

Test Case, Script and Data Development

The development of the actual test cases is done in an online database. The database has been set up
so that a number of scripts are associated with each test case. This capability can be used in different
ways. Generally, the test case embodies a given function or logical condition and the scripts contain
specific entries and other user actions (e.g., pressing function keys) that will test this condition. In the
discussion below, for convenience, “test cases” refers to cases and their associated scripts.

A checklist of different types of test cases is found in the Test Plan Template, Section 2.2.

Test cases are prepared with pre-defined expected results to validate the detailed requirements identified
during the development of the test plan. Test cases should generally be planned, described and written
with a “destructive” orientation. That is, the goal of each test case should be to demonstrate that the
system does not perform correctly. Examples are as follows:

 A positive functional test attempts to show that for at least some combinations of data and
options the system does not perform a given function correctly. This can be done, e.g., by using
“boundary” data values that are likely to produce a failure if the code has errors. If such cases
run correctly, confidence is high that other valid inputs will be processed correctly as well, while
the reverse is not true.

 A negative functional test attempts to show that under some (or all) error conditions the system
does not produce the correct error message.

Acceptance test cases are generally “black box” oriented, for several reasons. The user views the
system from the outside and is primarily concerned with system inputs and outputs. The user is not
familiar with the internal logic or code, or the physical structure of the system. Thus, user acceptance
test cases are developed from the business requirements and functional design documents, and can
therefore be developed before the detailed system design and code structure are known, thereby
accelerating the project schedule.

During the test case development process, specific conditions for test data files are identified. Some test
data will generally be available from prior testing levels. If additional test data is needed, a decision must
be made as to the approach - who will prepare and generate it, and how. The UAT team, users or
sponsor and developers must work closely together to make sure that the test data provided will
guarantee a complete acceptance test.

Test case development includes selection of archived cases and scripts for regression purposes. (See
the above discussion of regression tests.) When all test cases and scripts have been written, the work
plan for the last stage of acceptance testing, test case execution, should be revised if needed.

User participation in developing the test cases and scripts should be sought to the maximum extent
possible. At a minimum, the test cases and scripts must be reviewed by the users prior to being
finalized.

Prior Test Results Review

It is important to review results of prior test levels as soon as they are completed. Results should be
reviewed by build if delivery to UAT is by build. Otherwise, in a large project, review at intermediate
points is still advisable. Among the objectives are the following:

1. Verifying whether prior levels of testing are on schedule, in order to plan for possible delays in
starting UAT.

2. Verifying that the tests are adequately planned and fully completed, that problems have been
resolved, and that the system is acceptable for UAT. In some cases it may be necessary to “test
around” areas where there are still unresolved problems.

3. Determining the concentration of errors in different areas of the system, and planning the
distribution of user acceptance test cases accordingly.

Test Execution Management

Among the responsibilities of the UAT project leader during test execution are the following:

 Be sure that each team member has access to, and is generally familiar with, the current overall
test plan and work plan, and knows his/her part in it, particularly current and upcoming tasks.

 Be sure each team member has access to the tools, files and programs needed to do the work -
resolve any conflicts or delays.

 Verify that builds arrive correctly and on schedule from developers and migrate correctly and on
time to parallel or pilot testing if applicable.

 Help team members solve technical and operational problems.

 Coordinate the follow-up of problem reports.

 Be familiar with, and report, the status of all tasks, test cases, test runs and builds. The test
case/script database makes a considerable amount of status information available on-line.

 Coordinate the acceptance process with the users.

The project leader must be continually aware of the status of all tests and tasks. Depending on the
nature of the project and the people involved, approaches include:

 Reviewing on-line test results and problem reports, as well as automated summaries of these.

 “Management by Walking Around” (MBWA).

 Meeting formally with each team member, or the entire team, at least weekly.

The objective is not only to perform status reporting but to take corrective action, wherever possible, to
keep the project on schedule. Note that these management techniques apply generally to test planning
and test case development as well, particularly if multiple team members are involved.

Status Reporting

As testing proceeds, progress must be reported to development and UAT management, the business
owner of the system, other user management, and the development project manager, as well as to the
team members. The UAT Status Report is the primary means of disseminating this information.

It is emphasized that status reporting begins with the first reporting period after project initiation. The
status report should be prepared every two weeks or twice a month. Therefore, multiple status reports
will be produced during test plan and test case development as well as during test execution.

The project leader may tailor the contents of the report to satisfy the needs of the project. However, in
the narrative portion of each status report it is essential to

 Describe and explain all variances to testing-related tasks, including changes to either the
amount of estimated effort or the target dates.

 Explain the potential impact on the project whenever it appears that tasks will not be completed
as scheduled, e.g., if critical tasks are late.

 Describe any revisions to the work plan.

 Identify what deliverables are being tested, if test execution is in progress.

 Indicate the status of each deliverable, e.g., the degree of completion of the test plan and the
progress made in testing each build.

 Summarize pertinent testing issues and problems, and provide the current disposition of each.

 Describe testing activities that are planned for the next period.

 List and describe outstanding user and developer responsibilities.

A statistical summary should include, for the period and cumulatively, the number of:

 Requirements/design problem and enhancement reports, by priority/severity/status

 Test cases written in test case development and added during test execution

 Cases written but not tested (this item and those below apply beginning with test case execution)

 Cases failed

 Cases passed, and percentage

 High-level requirements successfully tested, identifying the areas

 Test execution problem and enhancement reports, by priority/severity/status.

Other statistics (measures) that can be reported cumulatively and used in future test planning include the
number of:

 Detail level requirements/screens/modules, etc.

 Hours spent in test plan development, including work plan development

 Hours spent in test case/script/data development

 Hours spent in test execution

 Problem and enhancement reports written, broken down by insertion point.

The insertion point is the project phase or product in which the problem was first introduced.

Sample Work Breakdown Structure

The following WBS is also available as an MSProject file that can be expanded to create the Work Plan.
See the User Acceptance Test Plan Template.
Note that (M) indicates a milestone, i.e., an event with no resources or duration generally marking the completion
of a significant deliverable by either Technology or UAT. Other items are tasks to which resources, e.g., staff
effort, and duration are assigned.

UAT Initiation
    High-Level Business Requirements Document Completion (M)
    Obtain Project Orientation
    Organize UAT Team
    Determine User Participation and Contacts
    Provide Project Orientation to Team
UAT Planning
    Initial Work Plan
        Develop Initial Work Plan
        Review Initial Work Plan
        Revise Initial Work Plan
    Detailed Business Requirements Document Completion (M)
    High Level Design/Functional Specification Completion (M)
    Prototype/Detailed Design Specification Completion (M)
    UAT Plan
        Develop UAT Strategy
        Review UAT Strategy
        Revise UAT Strategy
    Requirements
        Identify and Validate Requirements
        Develop Requirements Hierarchy
        Review Requirements Hierarchy
        Revise Requirements Hierarchy
    Completed Work Plan
        Develop Completed Work Plan
        Review Completed Work Plan
        Revise Completed Work Plan

UAT Case, Script and Data Development
    Build m
        Test Run n
            Develop UAT Cases/Scripts
            Review UAT Cases/Scripts
            Revise UAT Cases/Scripts
        Test Data Component p
            Check Available Sources, Determine Approach
            Create or Obtain Test Data
Pre-UAT Test Results Review
    Unit Test Completion (M)
    Review Unit Test Results
    Integration/System Test Completion (M)
    Review Integration/System Test Results
Normal UAT Execution
    Setup/Check Out Test Environment
    Setup/Familiarize with Test Procedures
    Setup/Familiarize with Test Tools and Libraries
    Prepare/Initialize Test Data
    Software Build m Test Execution
        Delivery to UAT (M)
        Test Run n
            Execute Test Case p
            Verify Test Case p; Report Problems
    Non-Software Build m
        Delivery to UAT (M)
        Item n Validation
            Perform Item n Validation Step p

Parallel Test Execution
    Setup/Check Out Test Environment
    Prepare/Initialize Test Data
    Software Build m Test Execution
        Test Run n
            Execute Test Case p
            Verify Test Case p; Report Problems
    Non-Software Build m
        Item n Validation
            Perform Item n Validation Step p
Pilot Test Execution
    Setup/Check Out Test Environment
    Prepare/Initialize Test Data
    Software Build m Test Execution
        Test Run n
            Execute Test Case p
            Verify Test Case p; Report Problems
    Non-Software Build m
        Item n Validation
            Perform Item n Validation Step p
First Acceptance Transmittal to User (M)
Final UAT Regression Test
    Plan Final Regression Tests
    Software Build m Test Execution
        Test Run n
            Execute Test Case p
            Verify Test Case p; Report Problems
Second (Final) Acceptance Transmittal to User (M)
UAT Management and Status Reporting

2.04 UAT Estimating

Introduction to UAT Estimating

Estimating effort is a key part of planning any project. The UAT organization plans and manages the
user acceptance test for a development or maintenance project as a separate project in itself, and the
testing manager is responsible for developing the work plan and managing as closely as possible to the
plan. This procedure covers the development of estimates of effort for the tasks in a UAT work plan.

This procedure assumes basic familiarity and experience with the concepts and techniques of project
planning and estimating. For the most part, it presents the specific considerations and steps for
producing estimates for UAT projects. Development of the work breakdown structure (WBS) - the
hierarchy of tasks - is discussed in Section 2.03, Test Planning and Management.

In the planning phase of a UAT project, a test plan is developed that contains all the information needed
to guide the remaining phases. The Work Plan is an essential part of the test plan. The Work Plan
contains task, resource and schedule information and should be prepared using MSProject. The
estimates are a key element of the Work Plan. The Work Plan is developed in phases, with increasing
detail and accuracy as the UAT project proceeds.

UAT Estimating Inputs

All the information needed to prepare the estimates of effort is developed in the course of preparing the
UAT Plan and recorded in the UAT Plan and its attachments. There are three types of inputs, listed in
the order in which they are utilized:

1. General underlying assumptions for the specific project that are developed early in the UAT
planning phase and listed in Section 1.3 of the UAT Plan. It is emphasized that all estimates
depend upon such underlying assumptions, e.g., the thoroughness of the testing that preceded
UAT, and the skill levels of the testing staff. During the planning phase, factual information is not
fully available in all of these areas, hence assumptions must be made.

In addition, Section 1.4 documents risk factors, including the possibility that some assumptions
may be violated. The impact of some risk factors occurring may be that testing effort will
increase.

2. The experience of the organization on similar projects. Given the differences among
organizations, general “industry” guidelines or rules of thumb can be used only as
reasonableness checks. Using them will still require project-specific inputs such as the amount
of time spent in coding, which are themselves only estimates at this point.

3. Detailed information relating to the specific project that is recorded in Sections 1 through 5 of the
UAT Plan. This information is developed as UAT planning proceeds, enabling the estimates to
be prepared and revised in stages as described below. Among the most important items of
information are the number and complexity of detailed requirements, the type of development life
cycle and technology environment and, as it becomes known, the number and types of test
cases and scripts that will be executed.

This procedure describes a “bottom-up” estimating process, in which, while preliminary estimates are
prepared at a high level, the project is ultimately broken down into small tasks, each task is estimated,
and the estimates are added up. The individual estimates are derived primarily by judgment, rather than
by formulas.

Developing Phase-by-Phase Estimates: Concepts

Generally, as a project proceeds, each phase produces information that makes possible more detailed
and accurate estimates for the remaining phases. The revised estimates are developed at the end of
each phase. In a testing project this is particularly true since planning, and developing the test cases that
will later be executed, are major phases of the project.

In MSProject, actual work estimates are created in the process of assigning resources (staff) to tasks,
which is ultimately done at the detailed level. It is suggested therefore that preliminary estimates be
entered in MSProject as duration estimates. The tool can then be used to store, sum up and document
the estimates. As the full detail of the WBS is developed, the estimate for each higher level task is
revised and apportioned among the detail level tasks, and the baseline plan is updated.

It is also possible to create a revised plan if progress differs significantly from what was anticipated. This
is referred to in MSProject as an interim plan, and MSProject will store up to five at a time. It is
recommended that interim plans not be created routinely, but only with management and user approval if
there are major deviations with assignable causes. An interim plan is created with the objective of
revising the estimates for tasks yet to be completed.

Interim plans are created so that the original plan, and any prior interim plans, are not lost when the plan
is revised. Estimates of effort and duration may be revised only to incorporate the variance that has
already occurred, or it may be anticipated that similar variances will occur for the remaining tasks or
phases. Revising estimates for phases and tasks already in progress is conceptually a separate process
from estimating future phases. Generally, such revisions should be required only for the Test Execution
phases; possibly for Test Script/Case/Data Development on a large or complex project.

The major phases of a UAT project, as listed in the sample WBS, are

 UAT Initiation

 UAT Planning

 UAT Case, Script and Data Development

 Prior Test Results Review

 UAT Execution phases, including final regression testing and possibly parallel and/or pilot testing.

Other tasks that appear at the top level of the WBS are

 First and Second Acceptance Transmittal to User

 UAT Management and Status Reporting

It is emphasized that UAT phases, including planning, must be estimated as accurately as possible, with
a complete WBS and resource assignments, before they are commenced. Only UAT Initiation (by
definition) may begin without prior documented estimates. The steps to be performed in each phase are
detailed below.

Estimating Phase 1: UAT Initiation

STEP 1.1: Information Gathering/Review

It is never too early to begin gathering information on the project from the developers and users.
However, the UAT Initiation phase should be completed when the developers are beginning the functional
specifications or high level design, and the Test Planning phase is about to begin. At this point the
business requirements are complete.

Key information that should be analyzed includes the following qualitative inputs:

 Type of project (new development or maintenance) and type of life cycle - waterfall, Rapid
Application Development (RAD), etc. A totally new system will be unfamiliar to the staff.

The sequence of UAT tasks has been keyed to the PDLC, which is a waterfall life cycle - a
traditional life cycle in which the tasks occur once, in sequence, for each project.

In RAD, and other iterative life cycles, some or all tasks are repeated during the course of a
project, e.g., repeated cycles of requirements/design and coding. The sequence of UAT tasks
might need to be adjusted to correspond with an iterative development life cycle, particularly if
the successive implementations are each released to production.

 Technology(ies) being used and the extent of each. The testing staff may have varying levels of
familiarity with these technologies.

 Extent and quality of documentation of requirements and any functions already implemented,
which will impact the effort required for requirements validation and test case design.

 Developers’ approach to unit and integration testing, which will impact the required effort in UAT
for test design, prior test case review, and test execution.

The quantitative inputs, which in some cases will be estimates, include

 Number of new or changed requirements (or sheer bulk of documentation if not decomposed),
number of new or changed screens, modules, files, etc.

 Effort for past and future life cycle phases, respectively. (Requirements and design may be
completed; coding not yet begun, hence the coding effort figure will be an estimate.)

 Time available between turnover to UAT and desired/required turnover to production. (A shorter
timespan for UAT may require more staff resources, assuming the schedule is feasible.)

Similar factors for past projects of similar type and magnitude, and the actual UAT effort required for
these, should then be examined to the extent possible.

STEP 1.2: WBS/Estimate Preparation

The UAT Planning phase tasks should now be completely detailed and estimated, based on the
information obtained so far. The estimate for detailed requirements validation will depend on the size and
quality of the requirements documents to be examined. Generally, however, the planning phase is less
directly dependent on project size than later phases and is a small portion of the total effort. Hence the
estimates should be fairly reliable and will not have major impact on the total UAT estimate.

The Pre-UAT Review phase should also be estimated in detail, since it contains few detailed tasks and is
dependent only on prior testing phases. This estimate can be revised in the next phase if needed.
Similarly the Acceptance Transmittal tasks should require little detail and preliminary estimates should be
made.

For the UAT Execution phases, a preliminary (possibly final) determination should be made as to whether
pilot and/or parallel tests will be required, and preliminary high-level estimates made, including an
allowance for final regression testing based on past experience. These estimates will later be detailed
and revised.

Finally, an estimate should be made at this time for the ongoing UAT Management and Status Reporting
task. This task is a “bucket” for the time spent on tracking and management (not planning) activities. It
may be broken down on a large or complex project, e.g., by the major UAT phases, since the level of
effort may vary. The best approach is to multiply a weekly level of effort by the expected number of
weeks. The level of effort increases with the number of people on the project. It will range from a
minimum of 1/2 day per week during all phases to full time commitment by the testing manager during the
test execution phase of a very large project.
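
As a worked example of this approach (all figures assumed for illustration): a 20-week project tracked
at one day (8 hours) per week would be budgeted 8 x 20 = 160 hours for this task. In Python terms:

    # Illustrative only: weekly level of effort multiplied by expected duration.
    hours_per_week = 8    # assumed: one day per week of tracking and management
    project_weeks = 20    # assumed total project duration
    print(hours_per_week * project_weeks)  # 160 hours for Management and Status Reporting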

The overall estimate in this phase is only as good as the estimate for test execution, the largest phase,
for which the detailed WBS itself will only be developed in Phase 2. Depending on the amount of
previous experience that can be brought to bear, the overall estimate at this stage may be off by as much
as a factor of 2 or 3. However, the best possible estimate of overall project effort should be made at this
time.

Estimating Phase 2: UAT Planning

STEP 2.1: Information Gathering/Review

The UAT Planning phase should begin immediately after the completion of UAT Initiation. In this phase,
detailed information about the project is documented in the UAT Plan and carefully analyzed, as
discussed in the Test Planning and Management procedure. The process of obtaining,
documenting and analyzing the information provides the remaining inputs needed to develop the next set
of estimates.

All of the general information on the project, and similar previous projects, has already been gathered. In
this phase, the detailed information gathering consists mainly of requirements analysis, i.e.,
decomposition and validation, as discussed in the Requirements Validation and Defect Reporting
procedure.

STEP 2.2: WBS/Estimate Preparation/Revision

The primary objective of test planning is to establish the hierarchy of builds, test runs and test cases, as
discussed in the Test Planning and Management procedure. This will enable the WBS for the UAT
Case, Script and Data Development and UAT Execution phases to be finalized.

A suggested procedure for improving the preliminary estimates of both Test Case/Script Development
and UAT Execution is to divide the test cases into 3 to 5 categories based on size, complexity,
technology, etc.; make each category an intermediate level task which can later be eliminated as detail
level tasks are written; and multiply an estimate for the typical test case within each category by the
estimated number of cases.

To obtain the typical case estimates, alternatives include writing a few sample cases in each category, or
reviewing actual effort for similar test cases on other projects.
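
A minimal sketch of this category-based calculation (the category names, case counts and hours per
typical case below are assumptions for illustration, not standards):

    # Illustrative category-based estimate for test case/script development.
    # Each category maps to (estimated number of cases, hours per typical case).
    categories = {
        "simple":  (40, 2.0),
        "average": (25, 4.0),
        "complex": (10, 8.0),
    }

    development_hours = sum(count * hours for count, hours in categories.values())
    print(development_hours)  # 40*2 + 25*4 + 10*8 = 260 hours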

Estimates for Test Data Development should be developed individually for each file, based on the steps
needed to develop it or otherwise arrange for it to be available, which will often depend on the build
strategy. It should be possible to finalize these estimates in this phase.

Since the test cases and scripts are not actually written in this phase, the estimates for test case
execution will be finalized in the next phase. However, the non-software deliverable validation steps
should be fully described, and can be completely estimated, in the planning phase.

To obtain the estimates for the UAT Execution phases, first estimate the time required to physically set
up, execute and validate a test case in each category. Note that if test cases are executed by an
automated tool, setup and validation are still required, but the estimate is for a series of test cases (test
run). The estimates for manual test execution will depend on the number of scripts per case and steps
per script. This can be estimated from the detail level requirements and past experience.

The estimate should then be multiplied by a factor that allows for the time spent in problem reporting and
follow-up, and reruns. A factor in the range of 2 to 3 is generally appropriate. A lower factor would be
used if the expectation is that relatively few errors will be found and vice-versa. The expectation is based
on the perceived quality with which steps from requirements to integration testing have been performed
by the developers. Experience with previous projects should be brought to bear.

The test plan also includes descriptions of the testing procedures. The nature of these procedures (e.g.,
standard versus custom), and the familiarity of the staff with the procedures and tools, will impact the
time estimates for actual test execution, problem reporting and follow-up. In some cases, training tasks
beyond project orientation will need to be added to the WBS.

If additional information about prior testing is available at this time, it should be documented in the
Assumptions sections as noted above, and the WBS and estimates for the Pre-UAT Test Results Review
phase should be revised. The estimates for the Acceptance Transmittal phases can also be revised.
The estimate for UAT Management and Status Reporting may also be revised and further detailed into
subtasks if desired.

Estimating Phase 3: UAT Case, Script and Data Development

STEP 3.1: Information Gathering/Review

The new information developed in this phase, which will be used to revise the estimates of later phases,
consists of the actual test cases and scripts for normal UAT execution, as well as any changes that may
be made to the structure of builds, test runs and test cases. Examination of the actual tests that will be run will
validate or modify the estimates made previously for test execution.

Parallel or pilot tests are generally carried out based on a standard sequence of business transactions,
rather than a set of individual test cases. The sample WBS allows for the use of builds in parallel and
pilot testing, because functionality may be delivered to these test phases, and possibly to production, in
builds or releases. Test runs will represent the on-line and/or batch business processing sequences.

STEP 3.2: WBS/Estimate Preparation/Revision

With the test hierarchy and parallel/pilot testing approach fully developed, including the actual test cases
and scripts, all of the information that can be available before test execution begins is now present. The
test hierarchy translates directly into the WBS, which should now be completed for the test execution
phase(s). The WBS for parallel and/or pilot testing should also be completed in this step, and the
estimates detailed and revised, if not completed previously.

The effort associated with test case execution may be apportioned among multiple types of tasks. As an
example, it may be desired to separate problem follow-up from test execution and problem reporting,
since follow-up will extend much further in time, while the execution of the test can be reported as
complete. As a further example, separating problem reporting from test execution may be done at the
test case level if the test cases are very long and complex or for some other reason the problem reporting
is done after the test execution is completed. Problem follow-up and/or reporting could also be separated
at the test run level, e.g., if an automated scripting tool is used.

With the test cases and scripts fully developed, it is appropriate to review and revise the estimates as the
WBS is completed and resources are assigned. At this point, the number of scripts in each case, and the
number of steps in each script, are known. Estimates are required for each lowest level task. Estimating
each case or script individually is theoretically possible, but involves a great deal of effort and is probably
not reliable.

A better idea is to use one or both of the following approaches:

 Use a formula based on the number of scripts and steps per case that gives good results when
applied to one or more similar past projects.

 Retain the approach used earlier of dividing the cases into groups. A larger number of groups
could be used, but more than 5 to 7 is likely to be excessive. Either manually revise the estimate
for each group, or combine the group approach with the formula approach by varying the formula
by group.

Another alternative is to estimate only at the test run level, if the task size is not excessive.
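
As a hedged sketch of the formula approach above (the coefficients are placeholders and would be
calibrated against actual hours from one or more similar past projects):

    # Illustrative per-case execution effort formula; the coefficients are
    # placeholders to be calibrated from similar past projects.
    def case_execution_hours(scripts, steps_per_script,
                             hours_per_script=0.25, hours_per_step=0.05):
        return scripts * (hours_per_script + steps_per_script * hours_per_step)

    print(case_execution_hours(scripts=4, steps_per_script=10))  # 4*(0.25+0.5) = 3.0 hours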

Estimating Phase 4: UAT Execution

For convenience, all UAT execution phases will be discussed here.

STEP 4.1: Information Gathering/Review

The essential information being gathered is the actual progress of the testing, in terms of meeting the
planned level of effort. Related data are the number and severity of problems being found. In addition,
new information related to parallel/pilot test execution may be obtained during normal UAT execution,
particularly if it was unavailable previously.

STEP 4.2: WBS/Estimate Preparation/Revision

As noted above, Test Execution may consist of one or two (rarely three) phases, i.e., normal UAT,
parallel and/or pilot testing. The information discussed above, obtained during normal UAT execution,
may help improve the estimates for parallel or pilot testing.

In addition, the final regression tests are planned and estimated in detail just prior to being executed. The
reason for this is that it is not possible to know in advance what problems will remain at this stage.

Typical Allocation of Testing Effort

This section presents some typical industry figures for the allocation of testing effort within a project. As
emphasized in this procedure, all estimates must be based on the factual information available at the time
the estimate is made and on a set of assumptions about the specific project. Given the many ways in
which projects can differ, typical figures can provide only a reasonableness check on a given set of
estimates.

In particular, the applicability of the typical allocation of effort for UAT is entirely dependent on how well
the allocations for prior testing levels apply to the project. As was seen in the procedure, early test
planning implies that the effort figures for all phases from programming on are themselves only estimates
at the time that UAT effort is first estimated.

The validity of the typical allocation is also dependent on the extent to which the test levels used in a
given project conform to the definitions listed below, used by the consulting organization that developed
these figures. The typical allocations are presented below the definitions.

 Unit Testing: The level of testing at which the smallest units of a system, i.e., modules, are
tested separately.

In the traditional development approach, unit testing may be estimated as approximately one-
third (33%) of the total programming phase effort, where the other two thirds encompass
preparation of detailed specifications (e.g. module specifications or "mini specs") and coding.

Unit testing effort can be heavily influenced by other factors, however. Examples include the
testing of code that has been automatically generated (e.g. from a CASE tool), which should
have inherently fewer defects and thus should require less iterative unit testing; and GUI
applications, where the complexity of possible user action combinations should increase the level
of unit testing effort.

 Integration Testing: The level of testing that validates both internal and external system
interfaces by successively assembling and testing groups of modules known as builds. System
testing is the set of integration test runs that exercise the full system.

In the traditional development environment, integration testing effort should equate to 20%
to 40% of the effort estimated for the programming phase. (Note that this is additional effort, not
a component of the programming effort.) The lower end of this range is appropriate for small or
simple projects, while the upper end should be used for large or complex projects, and for
projects where more than a very small percentage of the total code has been automatically
generated.

Client/server applications, or those using new technologies, should also be estimated at the
higher rate. This is to reflect the increased technical challenges likely to be faced by both the
developers and the testers.

Integration testing effort can be expected to be distributed in approximately the following ratios:

 10% Strategy/Planning

 30% Test Case Design

 60% Test Case Execution.

Note, however, that if automated scripting tools are used during execution of the testing, the
effort and elapsed time associated with this task are likely to be reduced.

 User Acceptance Testing: In this organization, the level of testing in which UAT, on behalf of
the users, determines whether the system operates according to its business requirements and
meets user expectations.

In the traditional development environment, acceptance testing effort should be anticipated to
almost equal the effort estimated for integration testing. In this scenario, the effort in hours for
test strategy/planning and test case design should be very close to that allocated in integration
testing. Acceptance test execution effort should be less than integration test execution, on the
premise that at this point in the application's development most defects should already have been
removed.

However, the acceptance team must also validate manual procedures, effectiveness of training,
and user manuals and guides. This increase in scope may offset the time savings in test
execution.
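
As a reasonableness check only, the following sketch applies the ratios quoted above to a hypothetical 1,200-hour programming phase; all of the inputs are assumptions for illustration.

    programming_hours = 1200
    unit_test_hours = programming_hours / 3        # ~33% of the phase itself

    integration_pct = 0.30                         # within the 20%-40% range
    integration_hours = programming_hours * integration_pct  # additional effort

    # Distribute integration effort per the 10/30/60 ratios above.
    split = {"strategy/planning": 0.10, "case design": 0.30, "execution": 0.60}
    for task, share in split.items():
        print(f"Integration {task}: {integration_hours * share:.0f} hours")

    # UAT strategy/planning and case design hours should be close to
    # integration testing's; UAT execution should come in somewhat lower.

This yields 400 hours of unit testing within the programming phase and 360 additional hours of integration testing (36 planning, 108 design, 216 execution), with UAT estimated at slightly under 360 hours before the scope additions noted above.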

A major factor influencing the estimates of effort for acceptance testing is the degree to which the
system's users have been involved in the application's evolution. If the users have been closely involved
throughout the requirements identification and application design, have participated in JAD (Joint
Application Development) sessions and in the development of application prototypes, etc., acceptance
testing can become almost a formality and may almost be viewed as an extension of the integration test.

However, if the users have been part of the development team, different users should be involved in the
acceptance test in order to adhere to the concept of independent testing.

In assessing the impact upon the acceptance testing estimate, previous experience with the application's
users provides a good indicator of the way in which the testing is likely to be conducted. In general, the
effort required may be reduced by some portion of the time the users have already dedicated to
activities such as prototype development and JAD sessions. Some duplication of effort must still be
allowed for, however, so that independent testing can involve different users than those most
involved during the application's development.

2.05 Requirements Validation and Defect Reporting

Introduction to Requirements Validation and Defect Reporting

The objective of all testing is to remove defects from the system being tested. Defects can be introduced
at any point in the life cycle, e.g., requirements, design or code, and can be discovered at any later point.
However, once a defect is introduced, it will generally be perpetuated in all later work products.
Therefore, the earlier in the life cycle that a defect is found, the less time consuming and costly it will be
to correct it.

The same applies, of course, to a proposed enhancement. The earlier it is proposed and accepted, the
fewer work products that will need to be changed.

As noted in the Test Planning and Management section, user acceptance testing starts when test plan
and test case design are begun. For acceptance testing, these tasks begin early in the life cycle, as soon
as the detailed Business Requirements Document is available. This work product is the primary input to
UAT planning and test case development. Other work products, such as the High Level
Design/Functional Specification, will also be utilized. For the purposes of this discussion, all of these
sources of information will be referred to as requirements.

Test planning is the beginning of User Acceptance Testing for two reasons:

 It is the first major involvement of the UAT group in the project.

 More significantly, it represents the first opportunity to test work products and remove defects
from the system early in the life cycle, as discussed below.

A defect in the requirements is an error waiting to be implemented. Correcting it at the earliest possible
point will prevent it from being introduced into succeeding work products, including the code itself.

As described in the Documentation Review section, UAT participates in formal reviews of all
documentation work products, to determine their overall quality and acceptability and identify major
problems and issues. Each product should be reviewed as soon as possible after it is completed. Such
a review of requirements will ensure their adequacy for writing the test plan.

However, requirements documents will generally contain defects in their details even after passing their
overall reviews. These defects may only become apparent as the test plan and test cases are written.
Correction of these defects is necessary in order to complete the writing of a correct set of test cases.

To assure that as many defects as possible are corrected early, the tester should approach the
requirements with a testing mindset during test planning, as the system itself will be approached during
test execution. The tester should seek to identify and report all defects, as will be done during test
execution. This activity is known as requirements validation. Since this activity identifies defects, it is
also called static testing; test execution, by contrast, is called dynamic testing.

Requirements Decomposition

One of the best ways of analyzing requirements, for the purposes of both writing test cases and finding
defects, is requirements decomposition. In this process, the requirements are broken down into a
hierarchy, or tree, preferably with a qualified numbering system for reference. At each level of the tree,
the requirements are placed in a logical sequence, with related items grouped together. Each
requirement should contain two elements:

 A short title in the form, “Verb, [Modifier], Object,” e.g., “Print Weekly Summary Report.”

 A short paragraph providing additional detail. If more than a short paragraph is needed,
additional levels of hierarchy should generally be used.

It is of course possible to attach illustrative material such as screen layouts. The hierarchical breakdown
continues until a testable level of detail is reached. As an example, the above requirement is valid, but
too high-level to be testable. “Print Customer Name,” with a specification of field size, is testable.

This breakdown makes it possible to verify that the requirements are adequately covered by test cases.
It also tends to clarify the material in the mind of the tester and expose defects. Actually doing the
decomposition is thus beneficial to the tester; even so, analysts should be encouraged to prepare
requirements in this form in the first place. If requirements are not originally prepared in this form, and
time is limited, writing only the requirement titles in a hierarchy will yield a significant portion of the
benefits. The tester will then, however, have to refer back to the original document when writing the test cases.
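
The following is a minimal sketch of such a decomposition, using a qualified numbering scheme and the title form described above; the requirement content itself is hypothetical.

    # A hypothetical requirements decomposition tree. Each entry maps a
    # qualified number to a (title, children) pair.
    REQUIREMENTS = {
        "3": ("Print Weekly Summary Report", {
            "3.1": ("Print Report Heading", {}),
            "3.2": ("Print Customer Name", {}),   # testable once field size is specified
            "3.3": ("Print Account Totals", {}),
        }),
    }

    def list_requirements(tree, indent=0):
        """Walk the tree, printing each qualified number and title."""
        for number, (title, children) in tree.items():
            print("  " * indent + f"{number} {title}")
            list_requirements(children, indent + 1)

    list_requirements(REQUIREMENTS)

Writing only the titles in this way, as suggested above, already exposes gaps and overlaps in the requirements.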

Types of Requirements Defects

There are several types of defects that may be encountered:

 Omission/Ambiguity: A specification that would be needed to write test cases and, indeed, to
code the program, is missing, incomplete or unclear.

 Inconsistency: A specification contradicts another specification in the same or a different
requirements document.

 Standards Violation: A specification violates a standard for the system or for the entire
organization.

 Other: A specification is known or believed to be incorrect for any other reason.

Requirements Defect Reporting and Follow-Up Responsibilities

The tester’s primary responsibility in this area is to be alert for problems in the requirements documents
and report them clearly. A tester may also wish to suggest an enhancement, i.e., a change or addition to the
requirements. As noted above, making these suggestions early in the life cycle, during test planning, is
far preferable to waiting until coding has been done.

It is, however, not the tester’s responsibility to resolve a problem or make the determination on a
suggested enhancement. Testers are encouraged to suggest problem solutions and specific
enhancements, but should focus principally on test planning and requirements validation. All corrections
and other changes must be made by the group responsible for the work product, who should also be
aware of, and examine, other areas that may be impacted by the change. Therefore, the tester must
report and assign the problem appropriately.

Given the tester’s responsibility both for the quality of the requirements and for developing the test plans,
cases and scripts, the tester is responsible for following up the resolution of the Problem Reports
submitted. Failure to obtain a resolution within the needed time frame, or to agree on the resolution,
should be escalated appropriately.

Requirements Defect Reporting Procedure

Requirements defects are reported using an online database. The form used is similar to the problem report
used during test execution, but differs in some details, because a requirements problem report is not
linked to, or created from, a test. The procedure is as follows:

 Open the requirements problem reporting database, if not already open.

 Compose a new Requirements Problem Report. The following fields in the heading are
maintained automatically:

Report #: The sequential number assigned to this report. (Automatically assigned, with
a prefix based on the tester’s name: the first two letters of the first name plus the first
letter of the last name, as sketched below.)

Status: Open/Scheduled/Resolved/Closed/Reopened. Initially set to Open, then
automatically updated as described below.
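
A minimal sketch of the prefix scheme follows; the function name and output format are illustrative, not the database's actual implementation.

    def report_number(first_name, last_name, sequence):
        """Prefix: first two letters of first name + first letter of last name."""
        prefix = (first_name[:2] + last_name[:1]).upper()
        return f"{prefix}-{sequence:04d}"

    print(report_number("Maria", "Lopez", 17))  # MAL-0017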

 Fill in the fields in the first section appropriately as described below and close the document. The
explanation of the fields and buttons on the form is as follows, grouped under the sections of the
form.

Tester: Problem/Change Request Description

Reported By: Name of tester. (Dropdown list containing only the current user’s name.
User should enter the first letter of the first name to automatically fill in the entire name.)

Project: Name of the project. (Dropdown list, to which entries can be added.)

Reported Date: Date and time form is opened. (System will fill in automatically.)

Type: Problem/Change Request. (Dropdown list.)

Look & Feel: Yes/No. Does the problem or change deal primarily with “look and feel”,
rather than functionality? (Dropdown list.)

Priority: High/Medium/Low. As judged by tester within organization or project guidelines.
(Dropdown list.)

Release: The number of the release of the system that is being developed in the current
project. (Dropdown list to which entries can be added.)

Problem Type: Omission or Ambiguity/Inconsistency/Standards Violation/Other.
(Dropdown list.)

Document: Select the specific work product type, e.g., Detailed Business Requirements
Document, that contains the problem or should otherwise be changed. (Dropdown list
with other entries allowed but not stored in list.)

Version/Date: The version or date of the document referred to above.

Assigned To: The group assigned responsibility for resolving the problem or considering
the suggested enhancement. This should be the group that authored the document
entered above. (Dropdown list. The assignment will be to an Analysis or Design, rather
than a Development, i.e., Programming group.)

Assigned Date: The date that the above assignment was made. (System will
automatically enter the current date.)

Problem/Change Summary: Brief statement of the problem or change.

Reference: The item(s) in question in the document being examined. Use page and/or
paragraph number(s), section or requirement title(s), etc., as needed.

Problem/Change Description: Complete statement of the problem or change. Indicate
what in the text or the proposed functionality (or what is missing) constitutes a problem
or should be changed. In the case of a standards violation, indicate specifically which
standard and which provision is being violated, and in what way.

Supporting Documentation: If the above reference is clear, this can be omitted.
However, material, including graphics, from the requirements document (and possibly
from the standard being violated) may be pasted in and comments interspersed, if this
helps to clarify the issue.

Save: Button to save the document.

Print: Button to print the document.

 The analyst or designer will complete the next section. The fields are as follows:

Analyst/Designer: Resolution Description

Name: Name of the analyst or designer. (Automated entry similar to Reported By
above.)

Resolution Analysis: Description of what will be done to resolve the problem or
incorporate the enhancement, if accepted; if no action will be taken, the reason.

Resolution Type: Corrected/Enhanced/Rejected. (Dropdown list to which entries can be
added.)

Scheduled Resolution Date: The date when the original product will be changed and
reissued. Entry here changes Status to Scheduled.

Include In: Change package in which the change or enhancement will be included.
(Dropdown list to which entries can be added, but not the same list as Release above.)

Actual Resolution Date: The date when the original product has been reissued or
otherwise made available, e.g., in an online database. Entry here changes Status to
Resolved.

Release Notes: Use if needed to record information for release notes.

 The tester will then review the revised product, just as a program fix would be tested. This will
often be necessary in any case, e.g., if some test cases could not confidently be written based on
the original document. The tester completes the final section. If the resolution is not satisfactory,
the problem is reopened, otherwise it is closed. The fields are:

QA/UAT Closure

Reviewed By: Name of tester reviewing the resolution. (Automated entry similar to
Reported By above.)

Date Reopened: Date that the report is being reopened (resolution rejected). Entry here
sets Status to Reopened.

Number of Reopens: Initially blank. Start at 1 for the first reopen and increment by 1
with each succeeding reopen.

Date Closed: Date that the report is being closed (resolution accepted). Entry here sets
Status to Closed.

Comments: Indicate the reason for rejecting the resolution and reopening the report.
Comments are optional if the report is being closed.

 At the bottom of the document, two fields are automatically updated whenever the document is
changed:

Modified By: The name of the person modifying the document.

Last Modified: The date and time when the last modifications were saved.
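
Taken together, the field entries above imply the following Status lifecycle. This sketch is illustrative only; the transition and field names are hypothetical.

    # Status transitions driven by field entries, per the descriptions above.
    TRANSITIONS = {
        ("Open", "scheduled_resolution_date"): "Scheduled",
        ("Scheduled", "actual_resolution_date"): "Resolved",
        ("Resolved", "date_closed"): "Closed",
        ("Resolved", "date_reopened"): "Reopened",
        ("Reopened", "actual_resolution_date"): "Resolved",
    }

    def next_status(current, field_entered):
        """Return the new Status after a field entry; unchanged if no rule applies."""
        return TRANSITIONS.get((current, field_entered), current)

    status = "Open"
    for entry in ("scheduled_resolution_date", "actual_resolution_date",
                  "date_reopened", "actual_resolution_date", "date_closed"):
        status = next_status(status, entry)
        print(entry, "->", status)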

2.06 Software Migration to UAT and Production

Introduction to Software Migration

The procedure for software migration, summarized below and described fully in the next topic, Software
Migration Procedure, controls software movement from Development (Technology) to UAT; from UAT to
Production; from Production to Development (as in the case of a back-out of software); or from
Development to Production (as in the case of an emergency change). It is recommended that ChangeMan
be used to its full potential to control the movement of software.

A summary of the procedure steps and responsibilities is as follows:

PROCEDURE STEP                                            RESPONSIBILITY

1.  Build Software Package                                Technology
2.  Freeze Software Package and Move to UAT               Technology, UAT
2A. Revert Package from UAT to Development                Technology, UAT
3.  Electronic Approval                                   Technology, Operations, UAT,
                                                          Technical Support, Users
4.  Promote Package to Production                         UAT
4A. Back Out Package from Production to Development       Operations
5.  Implementation Burn In (Delivery Systems Only)        UAT
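
The movements summarized above can be viewed as a simple state model. The sketch below is conceptual only; it is not ChangeMan's actual interface, and the state and action names are illustrative.

    # Conceptual package states and the movements allowed between them.
    ALLOWED = {
        "staging":    {"freeze": "frozen"},                    # Step 1
        "frozen":     {"promote_to_uat": "uat",                # Step 2
                       "revert": "staging"},
        "uat":        {"approve_and_promote": "production",    # Steps 3-4
                       "revert": "staging"},                   # Step 2A
        "production": {"back_out": "staging"},                 # Step 4A
    }

    def move(state, action):
        """Apply a movement, rejecting anything the model does not allow."""
        try:
            return ALLOWED[state][action]
        except KeyError:
            raise ValueError(f"cannot {action} from {state}")

    state = "staging"
    for action in ("freeze", "promote_to_uat", "approve_and_promote"):
        state = move(state, action)
        print(action, "->", state)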

Software Migration Procedure

The steps in the software migration procedure are listed below. Examples are provided in the next topic.

STEP 1: Build Software Package


Responsibility: Technology

Development work is performed in a separate region. Upon completion of coding, the software package
(the compiled software changes to programs, cataloged procedures, files, JCL, control cards etc.) will be
built (i.e. assembled and compiled in the development environment). After successful completion of unit
and system testing, the software package will be "frozen" by Development and signed off electronically in
ChangeMan by the Programmer/Analyst and the Applications Manager for the unit. ChangeMan also
provides an area for documenting the changes, with a number of pages available for recording the
information needed by the parties involved in the process.

STEP 2: Freeze Software Package and Move to UAT


Responsibility: Technology and UAT

In order for user acceptance testing to be effective, the migration from Technology (development) to UAT
must be performed correctly. The essential tasks in migrating software are the following.

1. Verify that the Release Information Bulletin (RIB) is complete and comprehensive.

2. Verify the completeness of the turnover package.

3. Migrate software to UAT library for Corporate and CCM Plus.

4. Secure the software in the UAT library for Corporate and CCM Plus.

5. Recompile the software in UAT for CCM Plus and build the release.

6. Compare the sizes of various modules for CCM Plus.

7. Produce DIF listing for CCM Plus. (Compare current and new source to create a difference or
change report.)
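
As an illustration of what the DIF listing in item 7 contains, the sketch below produces a difference (change) report using Python's standard difflib; the actual CCM Plus comparison is performed with mainframe tooling, and the source lines shown are hypothetical.

    import difflib

    current = ["MOVE A TO B.", "PERFORM CALC.", "STOP RUN."]
    new     = ["MOVE A TO B.", "PERFORM CALC-V2.", "STOP RUN."]

    # Emit a unified difference report of current vs. new source.
    for line in difflib.unified_diff(current, new,
                                     fromfile="current source",
                                     tofile="new source", lineterm=""):
        print(line)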

When a software package has been successfully frozen, no modification to the software package can
occur. The ChangeMan "freeze" process promotes the new or modified code from "staging" to "frozen"
levels and prohibits any changes without demoting the entire package to a "staging" library.

If the approval process has already begun, no changes can be made without demoting and reverting the
package. This includes the install date. If no approvals are on the package, the install date may be
changed without demotion.

ChangeMan will also record the promotion/demotion of the package and provide an audit trail and history
of the package. Should a modification be necessary, the package can be "unfrozen". The process to
"unfreeze" (or demote) a package is dependent on the step in the process. See examples in the table at
the end of this section.

Development and UAT will need to manage the migration of the changes when change packages contain
more than one version of the same module(s) (files, procedures, JCL, control cards etc.) that are being
worked on concurrently. If there are fixes in process to package(s) that were demoted, development will
incorporate the modifications, "freeze" the package, communicate the changes and inform Configuration
Management to re-promote the package so that testing can be resumed.

STEP 2A: Revert Package from UAT to Development


Responsibility: Technology, UAT

Should a problem occur with a software package in UAT, the package can be moved back to a
development (staging) state. This reversion of software would be done based upon joint discussion
between UAT and Technology Services.

STEP 3: Electronic Approval


Responsibility: Technology, Operations, UAT, Technical Support and Users

Upon successful completion of UAT testing, managers from Technology, Operations, UAT, Technical
Support and the users approve the software package and it is ready for promotion to production.

Every software package that is to be promoted should receive the proper levels of approvals by noon on
Tuesday each week so that the scheduled changes may be made available for discussion at the
Wednesday Change Meeting. If the changes, as they are presented, are not disputed, they will
automatically be migrated to Production according to the schedule by ChangeMan. The date of
promotion is jointly determined by Product Management, Development, and UAT and this is incorporated
into the change package.

In ChangeMan, there are only two types of changes: planned and unplanned. An unplanned, or
“emergency,” change is one that was not scheduled and therefore was not presented at the Change
Meeting. Scheduling as many changes as possible to meet the "planned" criteria will add to users’ comfort
level by giving them advance knowledge of the changes being implemented.

STEP 4: Promote Package to Production
Responsibility: UAT

The objective for software package promotion is to provide the following:

1. A clear understanding of the new software and how it works in conjunction with other production-like
functions or activities that are or may be happening concurrently.

2. A secure, controlled production-like environment in which user testing can exercise the software and
anticipate potential issues under production-like conditions.

Several possible business scenarios should be processed; regression testing and volume testing should
be conducted to the levels that can be supported within the environment. It is probable that as the
regions are more fully utilized and the array of test cases is augmented, more extensive regression and
volume tests can be supported.

Ideally, the appropriate regression tests and volume tests should be a joint decision with Technology and
user community participation.

Until the regions are more stable, the level of tests will be determined by UAT staff.

STEP 4A: Back Out Package from Production to Development


Responsibility: Operations

Should a problem occur with a software package in production, the package will be backed out by
Operations. Notification is provided by Operations to the programmer or Programming Manager on the
original Change Package approval list.

Once the back-out takes place, a prior version of software is put into production. The identical version
will need to be put into UAT at the same time or upon initializing the UAT region in question.

STEP 5: Implementation Burn In (Delivery Systems Only)


Responsibility: UAT

UAT will make the release available to the Implementation group for burn-in. UAT is available to answer
any questions.

Examples of Software Promotion and Demotion

POSSIBLE SCENARIO / STAGING DEVELOPMENT STEP / UAT STEP

1) PROMOTION

1A) Existing program; new functionality being developed.
    Development Step: Only copy? Yes/No. (1Aa) If Yes, unit & system test and advise UAT.
    (1Ab) If No, determine the planned production schedule for all current and impending
    change(s). (1Ac) If close, determine status/time frame of inclusion.
    UAT Step: (1A) When "frozen," promote to UAT; all relevant documentation must be
    received. (1Ab) Receive communication from development; mutually determine the best
    approach. (1Ac) Determine the level of testing that can be done before "demotion" to
    include revised software.

1B) New program(s); new functionality.
    Development Step: Same as above; "freeze" and advise UAT.
    UAT Step: Promote into UAT; approve for promotion to Production.

2) DEMOTION

2A) While in UAT.
    Development Step: Receive notification that software has a bug.
    UAT Step: Communicate the nature of the software bug(s) to Development; provide
    copies as possible (online; hard copy); demote the package for fixes.

2B) After Production install.
    Development Step: Development Mgr. receives call; software is backed out of production
    and a prior level of software is re-installed.
    UAT Step: UAT regions will reflect the current level of software when the regions come
    up (prior versions of load library member(s) re-installed automatically).

2C) Emergency Change.
    Development Step: Receives call; obtains emergency ID; fixes code; ensures that the
    production process resumes.
    UAT Step: Tests AFTER installation; will be informed that the change has occurred via
    electronic notification (as above) to IDs as determined. Note that there must be an
    STR# in the system.

2.07 User Acceptance Test Execution

Introduction to UAT Execution

Before beginning UAT Execution, the UAT project leader should review the test plan and cases with the
Lead Product Manager/sponsor. Once testing is complete, the test plan will assist in writing and
documenting the test results in preparation for the Certification/Non-Certification Memo.

Testing will be performed in the UAT test environments: QADWCCA, QACCA or NJDEV.

As problems are identified, system trouble reports will be issued and tracked. The problems reported
may involve, but are not limited to, software, hardware, communications, and the operating system. In
addition, the tester may wish to suggest a change to the system. See the topic, Reporting Problems and
Change Requests.

When execution of each script is completed, a completion report is prepared indicating Pass or Fail, as
described in the topic, Reporting Test Results.

It is essential for the test project leader, and each tester, to ensure that problem reports and change
requests are followed up, that new cases/scripts are developed once changes are implemented, and that
failed scripts are retested once problems are reported as fixed.

Reporting Problems and Change Requests

The purpose of the System Trouble Report (STR) is to communicate the problem effectively to
Technology. Each report will have a unique STR number as well as the release, the version number and
the program(s) affected, if identifiable. It also contains a descriptive, understandable explanation so that
the programmer can attempt to reproduce the same result and determine a fix for the problem, if possible.

Certain types of problems will prevent testing from continuing; others will allow testing of other non-
related paths to proceed.

Every resolved STR will be re-tested in the designated release of software.

Opening a Trouble Report/Change Request

A test execution STR can be created in the online database directly from the Test Script document that is
being executed.

As a rule, one problem should be reported per STR. If this is a change request, assign it to Delivery
Systems for their input on the suggestion. The key sections of the STR form, and the key fields to be
filled in by the tester, are described below.

Tester: Problem/Change Request Description

Test Summary: If related to a specific test case, indicate the section name and/or number. If
it is not related to a test case, one should be built.

Problem Summary: This is a one line description of a single problem or change request.

Problem Description: The exact details of the situation are explained step by step in this section.
This part of the STR is critical in the investigation, recreation, retesting and decision making for
the problem or change request.

If conversations result from questions, the date and the UAT tester’s initials should appear before
the summary of the discussion.

Supporting Documentation (CCM Plus): The A2B Communications screen and/or the DTR file is
necessary, depending on the type of STR, along with the actual CCM Plus screen(s).

Supporting Documentation (CCM): Must contain copies of the related screen(s). If only a
printout is available, it should be saved with the STR# recorded at the top, and a note indicating
where the original can be located should be entered.
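
A minimal sketch of the tester-entered fields described above, expressed as a simple record, follows; the field names are illustrative and do not represent the online database's actual schema.

    from dataclasses import dataclass, field

    @dataclass
    class SystemTroubleReport:
        str_number: str                  # unique STR number
        release: str                     # release/version affected
        programs_affected: list = field(default_factory=list)
        test_summary: str = ""           # related test case, if any
        problem_summary: str = ""        # one line; one problem per STR
        problem_description: str = ""    # step-by-step recreation details
        supporting_documentation: list = field(default_factory=list)

    report = SystemTroubleReport("MAL-0032", "7.1",
                                 problem_summary="Totals screen drops last row")
    print(report.str_number, "-", report.problem_summary)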

UAT Closure

Closed: The STR will be closed if the specific problem is fixed. Enter the date closed and attach
all supporting documentation to the Test Completion Report. If another problem results from this
fix, open another trouble report. Cross reference this in the comments field on both the STR and
the Test Completion Report.

Reopened: If the problem has not been resolved, enter the number of times the STR has been
reopened in the reopened field and the date it is being reopened.

Comments: The comments field on both the STR and the failed completion report should contain
a line with the date, tester and exact reason for reopening. Enter any additional information that
is relevant to the STR.

Reporting User Acceptance Test Results

Once a script has been completed the results must be reported. The Completion Report can be created
in the online database directly from the Test Script that is being executed.

Test Passed: Yes/No. Causes “Passed” or “Failed” to display in the view.

Supporting Documentation: All test results will be saved in this section. When making specific
changes, where applicable, the before and after screens should be included, e.g., when making a
change to a user’s options.

Comments: This section contains any additional information relevant to the case that the tester
needs to know. If the test case fails, add the associated STR# to the comments section, along
with a brief description of the reason for the failure.

2.08 Test Results Archive

Test Results Archive Requirements

Once a project is complete, test results will be maintained on site for a minimum of one year. According
to corporate guidelines set by the Audit Division, major enhancements require a one-year audit trail.

The hard copy file will include a listing of all test plans, cases and scripts. Test results will be logged and
filed by Project Number. When it is necessary to review and/or remove test results, UAT will assist in
locating the documents and ensure that the sign-out log is completed. See the Documentation Control
Log form.

2.09 Certification Process

Certification Criteria

A release may be considered for certification when the following criteria have been met.

 The User Acceptance test plan has been successfully executed.

 End-to-end testing has been completed (when applicable).

 Full regression testing has been performed (when applicable).

 The test results have been reviewed by the users.

 The trouble report log has been reviewed and there are no outstanding severe application-related
trouble reports.

 Other open trouble reports have been discussed with development and product management;
decisions have been made regarding closure, based on an assessment of risk.

 UAT evaluation and certification memos have been completed.

UAT and the users who have reviewed the test results must both approve the readiness of the release for
certification. In addition to UAT signoff, the sponsor, i.e., the owner of the application, is also required to
sign the release memo.
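
For illustration, the criteria above can be treated as a simple checklist. This sketch is an assumption about how such a check might be expressed, not part of the certification procedure itself; the criteria names are hypothetical.

    # Illustrative criteria names for the checklist above.
    CRITERIA = [
        "uat_plan_executed",
        "end_to_end_complete_or_na",
        "regression_complete_or_na",
        "results_reviewed_by_users",
        "no_outstanding_severe_strs",
        "open_strs_risk_assessed",
        "memos_completed",
    ]

    def unmet_criteria(release):
        """Return the criteria still unmet; an empty list means ready."""
        return [c for c in CRITERIA if not release.get(c)]

    release = dict.fromkeys(CRITERIA, True)
    release["no_outstanding_severe_strs"] = False
    print(unmet_criteria(release))  # ['no_outstanding_severe_strs']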

Certification Procedure

The certification procedure consists of two stages. The second stage is required if user approval is not
obtained in the first stage.

Certification Procedure Stage 1

1. The UAT team manager transmits a Certification/Non-Certification Memo, discussed further in the
next topic, to the sponsor/system owner, indicating that the user acceptance test has been
completed. Note that the memo is required as a notification when the scheduled testing has been
concluded, even if the release is not considered acceptable and is to be sent back to Technology.
The remainder of the procedure assumes that UAT has indicated that the release is acceptable.

2. Within an agreed time period, the sponsor reviews the test results, and performs or requests any
additional validation that is considered appropriate.

3. If the system, build or release is accepted, the user so indicates by memo to the UAT team manager.
The UAT team manager so informs the development team manager, and implementation proceeds.
If the system, build or release is not accepted, the user so indicates by memo, indicating what further
work is needed.

4. The UAT team manager verifies that the problems and changes are clearly and specifically
described, returning any unclear requests to the user for clarification. The requests are then
forwarded to the development team.

Certification Procedure Stage 2

If problems were found, or changes requested, by the user during Stage 1, these are reviewed by the
development team manager and UAT team manager, and target dates are established for problem
resolution and/or implementation of changes. The corrections and changes are made and tested by the
development team.

Once the system or build is repromoted to User Acceptance Testing, the failed tests, as well as
appropriate new or revised tests (for requested changes) and regression tests, as determined by the user
acceptance team manager, are executed. Any new discrepancies are reported and resolved between the
user acceptance and development teams. The Stage 2 acceptance procedure is as follows:

1. A Certification/Non-Certification Memo is sent to the sponsor, indicating that the problems and
changes identified in Stage 1 have or have not been resolved.

2. Within an agreed time period, the sponsor reviews the test results, and performs or requests any
additional validation that is considered appropriate.

3. If all Stage 1 problems and changes have been resolved, the user so indicates by memo to the UAT
team manager. The system, having met the existing requirements, will normally proceed to
implementation. Any new change requests constitute new requirements and will be treated as a new
project request.

4. The UAT team manager verifies that the changes are clearly and specifically described, returning any
unclear requests to the user for clarification. The requests are then forwarded to the development
team.

Certification/Non-Certification Memo

When the release has been completely tested, the User Acceptance Test leader will provide a
Certification/Non-Certification Memo. See the Certification/Non-Certification Memo form.

The release certification will include an overview of the product, a certification statement, any required
statements of major concerns or issues, and an install recommendation.

The memo should also describe any adjustments made due to hand-off problems, delays or constraints,
along with a statement of the criteria or functionality expected.

2.10 Post Implementation Review

Post-Implementation Review Process

When a major project has been in a production environment for a period of time, generally three to six
months, a post implementation review should be conducted. The more “hard” information available for
this review, the better. Therefore, the decision to hold the review is best made as part of the initial
planning of the project, agreed to by Technology and UAT. This will help enable the collection of useful
information during the initial implementation period, particularly if this data is not normally collected.

A decision to hold a review, e.g., by management directive, can also be made at any time after
implementation. In this instance it is best to allow for a period of operation during which
appropriate data can be collected, again, if it is not normally available. In some cases, it will be possible
to obtain data by researching the operational period that has already elapsed.

The responsible parties identified by the Project Development Life Cycle (Technology, UAT and the
user) meet to discuss and assess the actual features and benefits realized versus those anticipated, as
well as any “lessons learned.”

At this time the financial impact and implications should also be reviewed, e.g., actual versus anticipated
costs and tangible benefits.

3 Forms

3.01 Project Initiation and Control

PDLC Checklist

This form is a combination of all sections of the PDLC Description shown in Section 4.02, available as a
file attachment.

3.02 Documentation Control

Documentation Check List

DOCUMENTATION                                DATE REC’D    UAT INITIALS    COMMENTS
Project Request Form

Business Requirements Document

Functional Specifications

Technical Specifications

Program Specifications

Specifications for Interdependent Systems

Unit Test Plan

Unit Test Cases/Scripts

Unit Test Results

System Test Plan

System Test Cases/Scripts

System Test Results

Programs/JCL, etc.

User Acceptance Test Plan

User Acceptance Test Cases/Scripts

UAT Results/TRs

Release Information Bulletin

Runbook/Operational Documentation

User Documentation

Documentation Control Log

PROJECT NUMBER DATE REMOVED NAME DATE RETURNED

3.03 Problem Reporting

Requirements Trouble Report

This form is currently being developed.

System Trouble Report

This form is composed online using a button on the Test Script form.

Problem Tracking System
Report #:
Status:

TESTER: PROBLEM/CHANGE REQUEST DESCRIPTION

Reported By: Project:

Reported Date: Type: Look & Feel:

Priority: ID/Cust/Reg/Acct:

Release: Category:

Reported in: Occurred Date:

Assigned To: Assigned Date:

ID Type: Region: Platform: Enc:

Test Summary:

Problem Summary:

Problem Description:

Supporting Documentation:
Save STR Print STR
DEVELOPER: RESOLUTION DESCRIPTION

Name:

Resolution Analysis:

Resolution Type: Scheduled Resolution Date:

Include in: Actual Resolution Date:

Release Notes:

QA/UAT: CLOSURE

Create Test Completion Report


Tested By:

Date Reopened: Number of Reopens:

Date Closed:

Comments:

Modified By: Last Modified:

3.04 System Approval and Release

Certification/Non-Certification Memo

MEMORANDUM TO: Lead Project Manager/Business Sponsor

Technology Project Manager

FROM: UAT Project Manager

SUBJECT: Certification/Non-Certification for (Project Request Number)

DATE: Month, Day, Year

---------------------------------------------------------------------------------------------------------------------

This will serve as the concurrence memo to certify release _____(release number) _____ that contains
the following enhancements. Online test reports and actual STRs are available upon request.

Functional and regression testing have been completed.

The following System Trouble Reports have been resolved:

The following System Trouble Reports remain outstanding:

All testing was performed with the following exceptions:

The release of software can be placed into production. UAT no longer supports previous versions of this
software.

UAT: ________________________________

Sponsor: ________________________________

Based upon the above outstanding issues, UAT does not recommend that this release of software be
placed into production.

UAT: ________________________________

Sponsor: ________________________________

4 Project Development Life Cycle (PDLC)

This chapter has minor revisions to include early test planning.

4.01 PDLC Flow Diagram

PDLC FLOW DIAGRAM Part 1: Procedures 1-5

[Flow diagram: swimlanes for Implementation, Distribution Services UAT, Technology Services, Data
Center Services, CGIN, Global Capabilities/Marketing, Marketing Support, Training and Sales.
Procedures shown: Concept Definition (1); High Level BRD or PRF (1A); Log-In Control (TBIS) (2);
Preliminary Sizing & Scope (2A); Priority Review of PRF’s (2B); Team Review (3); Pre-MEP
Development & Approval Process (4); Detailed BRD (4A); BRD Review [Legal/Audit] (5). Each
procedure is described in Section 4.02.]

*FORMAL CHANGE CONTROL PROCESS WILL EXIST TO UPDATE BRD AT ANY TIME DURING PROJECT LIFE CYCLE.

PDLC FLOW DIAGRAM Part 2: Procedures 6-12B

[Flow diagram: same swimlanes as Part 1. Procedures shown: Functional Specifications/High Level
Design (6); CGIN Network Specification (6A); User Acceptance Test Initiation (6B); Functional
Specifications Review/Sign-Off (7); External Design Spec/Prototype (8); Preliminary Customer Roll-Out
Strategy (8A); Training Needs Analysis (8B); Data Center Development/Capacity Plan (8C); Software
Specification Change Control, go back to Step #6 (8D); CGIN Development/Capacity Plan (8E); User
Acceptance Test Plan (8F); Conversion Strategy (8G); User Acceptance Test Plan Review (8H); MEP
Financial Plan (9); MEP Team Review & Sign-Off (10); Final Program Specifications (11); User
Acceptance Test Cases, Scripts, Data (11A); User Acceptance Test Cases, Scripts, Data Review (11B);
Data Center/CGIN Capacity, Performance, Contingency Final Plans & Purchase Initiation (12); Product
Documentation & Marketing Collateral and Legal Contracts (12A); Training Plan (12B). Each procedure
is described in Section 4.02.]

*LEGAL/PROCESS STEPS NOT NOTED.

PDLC FLOW DIAGRAM Part 3: Procedures 12C-17

[Flow diagram: same swimlanes as Part 1. Procedures shown: Operational Procedures (12C);
Implementation Plan (12D); Technical Spec & Integrated Project Plan Team Review/Sign-Off (13);
Programming/Unit Test (14); Training Course Development (14A); Internal Communication/Notification
(14B); Detailed Systems/Network Test Plan (15); Detailed Systems Test Execution (16); Prior Test
Results Review (16A); Contract Development, if required (16B); Intersystem Migration Coordination (17).
Each procedure is described in Section 4.02.]

PDLC FLOW DIAGRAM Part 4: Procedures 18-24A

[Flow diagram: same swimlanes as Part 1. Procedures shown: Audit Review & Sign-Off (18); User
Acceptance Test Execution (19); Install Hardware/Network Operating/Contingency Capacity (19A);
Receive Draft Marketing Materials (19B); Contract Delivery & Execution (19C); Production Acceptance
Test, if required (20); Test Results Analysis, Software & Production (20A); Classroom Training Begins
(20B); Errors/Issues Identified (21); Documentation Feedback (21A); Reprogram/Regression Test until
Acceptance, go back to Step #14 (22); Final UAT & User Acceptance Sign-Off (23); Audit Review &
Sign-Off (24); Customer Materials Printed (24A). Each procedure is described in Section 4.02.]
PDLC FLOW DIAGRAM Part 5: Procedures 25-32

[Flow diagram: same swimlanes as Part 1. Procedures shown: Software Release Control Process (25);
Internal/External Communications (25A); "ChangeMan" Sign-Off/Schedule Move/Complete Release
Control (26); Data Center Change Meeting (27); Data Center Turnover (28); CGIN Turnover (28A); PC
Installation/Contingency Plan (29); Mainframe Installation/Contingency Plan (29A); Classroom Training
Completed (29B); On-Job Training Support (29C); PC Alpha/Beta/Parallel/Pilot (30); Mainframe/Network
Alpha/Beta/Parallel/Pilot (30A); Customer Roll Out/Feedback (31); Post Implementation Review (32).
Each procedure is described in Section 4.02.]

4.02 PDLC Description

PDLC DESCRIPTION Part 1: Procedures 1-5

PROCEDURE / RESPONSIBILITY / RATIONALE & DESIRED OUTPUT / DATE COMPLETED OR EXPLANATION

Each entry below gives the procedure, the responsibility, and the rationale & desired output; the Date
Completed or Explanation column is left blank, to be completed per project.

1. Define Concept
   Responsibility: Representatives from all functions as appropriate*
   Rationale & Desired Output: Gain agreement on project goals to enable the High Level Business
   Requirements Document to be developed.

1A. Create High Level Business Requirements Document/Project Request Form
   Responsibility: Representatives from any department (project sponsor/submitter)*
   Rationale & Desired Output: High Level Business and Cost Justification; feasibility & preliminary
   approval for New Development or Enhancement. Preliminary cost/benefit.

2. Log in Control Number
   Responsibility: Technology Business Integration and Support
   Rationale & Desired Output: Provide project request number on project request systems.

2A. Preliminary Sizing & Scope of Project
   Responsibility: Technology Services
   Rationale & Desired Output: Detail scope/sizing effort; number of "man-days" needed; & feasibility
   of change.

2B. Priority Review for PRF's
   Responsibility: Global Capabilities/Marketing
   Rationale & Desired Output: Assigned priority for project.

3. Present for Team Review
   Responsibility: Representatives from all functional areas*
   Rationale & Desired Output: Decision on priority; est. time frames for completion of development
   through systems tests. Define project team.

4. Pre-MEP Development & Approval Process
   Responsibility: Designated participants
   Rationale & Desired Output: Decision on MEP; preparation and responsible parties. A "Building
   License" is granted if appropriate. "Seed money" is provided to create functional specs.

4A. Develop Detailed Business Requirements Document
   Responsibility: Sponsor with assistance from functional areas
   Rationale & Desired Output: Detailed Business & Cost Justification; feasibility & preliminary
   approval for New Development or Enhancement. Preliminary cost/benefit. Signoffs from functional
   areas.

5. BRD Review [Legal/Audit Approval for major development only]
   Responsibility: Sponsor/Dist. Svcs. UAT/Technology Services/Data Center/CGIN/Global
   Capabilities/Marketing/Legal/Audit
   Rationale & Desired Output: Risk Analysis; System synopsis. Preliminary audit approval is granted
   as appropriate.

* Project Manager will assume lead role unless otherwise agreed upon by project team.

PDLC DESCRIPTION Part 2: Procedures 6-12B

PROCEDURE / RESPONSIBILITY / RATIONALE & DESIRED OUTPUT / DATE COMPLETED OR EXPLANATION

6. Functional Specifications/High Level Design
   Responsibility: Representatives from all functional areas; the sponsor may elicit support in order to
   complete the document.*
   Rationale & Desired Output: The "Whats" & high level "Hows" of the requested change; method of
   implementing the detailed enhancement, change or product.

6A. CGIN Network Specification

6B. User Acceptance Test Initiation
   Responsibility: Distribution Services UAT
   Rationale & Desired Output: Initial scoping and planning of UAT.

7. Functional Specifications Review/Sign-Off
   Responsibility: Representatives from all functional areas*
   Rationale & Desired Output: Review of the document to ensure that all areas are included.

8. External Design Specification/Prototype
   Responsibility: Technology Services
   Rationale & Desired Output: Creation of a working model and very detailed design of the new
   development effort.

8A. Preliminary Customer Roll-Out Strategy
   Responsibility: Marketing Support/Global Capabilities/Marketing
   Rationale & Desired Output: Plan for installation of change to user community.

8B. Training Needs Analysis
   Responsibility: Training
   Rationale & Desired Output: Requirements, methodology and cost assessment for teaching the end
   user community new functions or a new system.

8C. Data Center Development/Capacity Plan
   Responsibility: Data Center Services
   Rationale & Desired Output: Analysis of impact of change on infrastructure/resources/capacity.

8D. Software Specification Change Control Process
   Responsibility: Dist. Svcs. UAT/Tech. Svcs.
   Rationale & Desired Output: Identify open items or issues; submit change request/modified spec
   (as appropriate).

8E. CGIN Development/Capacity Plan
   Responsibility: CGIN
   Rationale & Desired Output: Analysis of change on infrastructure/resources/capacity.

8F. User Acceptance Test Plan
   Responsibility: Distribution Services UAT
   Rationale & Desired Output: Strategy to test conformance to business requirements and
   expectations.

8G. Conversion Strategy
   Responsibility: Data Center Services, CGIN
   Rationale & Desired Output: Strategy to perform conversion. Includes parallel strategy.

8H. User Acceptance Test Plan Review
   Responsibility: Global Capabilities/Marketing
   Rationale & Desired Output: Review of test plan by users.

9. MEP Financial Plan
   Responsibility: Representatives from all functional areas*
   Rationale & Desired Output: Development of the financial plan and justification for the proposed
   development effort.

10. MEP Team Review & Sign-Off
   Responsibility: Representatives from all functional areas*
   Rationale & Desired Output: Project review/sign-off on preliminary project plan. Go-No Go decision
   on project. Functional spec should be "frozen". Approval of major expenditure proposal (if
   appropriate).

11. Final Program Specifications
   Responsibility: Technology Services
   Rationale & Desired Output: Preparation of Program Specification and Test Cases; includes
   programs, procedures and mechanics of change.

11A. User Acceptance Test Cases, Scripts, Data
   Responsibility: Distribution Services UAT
   Rationale & Desired Output: Test cases, scripts and data for the user acceptance test.

11B. User Acceptance Test Cases, Scripts, Data Review
   Responsibility: Global Capabilities/Marketing
   Rationale & Desired Output: Review of test cases, scripts and data by users.

12. Data Center/CGIN Capacity-Service Performance & Contingency Final Plans & Purchase Initiation
   Responsibility: Data Center (Mainframe Development)/CGIN
   Rationale & Desired Output: Plan for capacity; analyses of effect on performance; and disaster
   recovery in the event of system or component failure. A test plan for the integration of
   hardware/firmware/systems software as appropriate for the effort. Identification of appropriate
   resources.

12A. Begin Product Documentation & Marketing Collateral & Legal Contracts
   Responsibility: Global Capabilities/Marketing/Marketing Support/Legal
   Rationale & Desired Output: Prepare preliminary plan for the production of guides/aids using
   requirements and functional spec. (Promotional materials or aids may be included.)

12B. Training Plan
   Responsibility: Training
   Rationale & Desired Output: Plan preparation and assessment of requirements; training manuals,
   guides, etc. and logistical needs. Assemble additional information from functional, programming and
   test plans to be factored into the training program.

* Project Manager will assume lead role unless otherwise agreed upon by project team.

PDLC DESCRIPTION Part 3: Procedures 12C-17

PROCEDURE / RESPONSIBILITY / RATIONALE & DESIRED OUTPUT / DATE COMPLETED OR EXPLANATION

12C. Operational Procedures
   Responsibility: Distribution Services UAT, CGIN, Data Center
   Rationale & Desired Output: Procedures that will need to be changed or implemented as a result
   of project implementation.

12D. Implementation Plan
   Responsibility: Implementation
   Rationale & Desired Output: Completed plan.

13. Technical Specification & Integrated Project Plan Team Review/Sign-Off
   Responsibility: Representatives from all functional areas*
   Rationale & Desired Output: Final review before the coding effort to ensure that all outstanding
   issues are resolved; Technical spec is "frozen". Integrated project plan is presented; Program Mgr.
   designated; sign-offs occur.

14. Programming/Unit Test
   Responsibility: Technology Services
   Rationale & Desired Output: Actual coding and documentation of programming changes; testing of
   each path/routine and new or modified programs, and any combinations of paths/routines etc.

14A. Training Course Development
   Responsibility: Training
   Rationale & Desired Output: Assess additional information derived from documents and reviews to
   prepare the course of training for the new function or development.

14B. Internal Communication/Notification
   Responsibility: Marketing Support
   Rationale & Desired Output: Creation/distribution of information regarding change(s) to internal
   staff.

15. Detailed Systems/Network Test Plan
   Responsibility: Technology Services/CGIN
   Rationale & Desired Output: The testing of all new systems interfaces and the effect of the change
   on the integration of code/firmware with the existing system/platform.

16. Detailed Systems Test Execution
   Responsibility: Technology Svcs./CGIN/Data Center Services
   Rationale & Desired Output: Perform test; review test results and documentation; prepare for
   turnover to Business Testing. Conduct review with Audit if necessary.

16A. Prior Test Results Review
   Responsibility: Distribution Services UAT
   Rationale & Desired Output: Review of all prior test results prior to User Acceptance Test
   Execution.

16B. Contract Development (if required)
   Responsibility: Global Capabilities/Marketing/Marketing Support/(Legal)
   Rationale & Desired Output: Creation of agreement between bank and customer covering product
   service and delivery parameters.

17. Intersystem Migration Coordination
   Responsibility: Dist. Svcs. UAT/Tech. Svcs./Data Center Svcs.
   Rationale & Desired Output: Movement of source to the environment where business simulation
   tests begin (QA Region). New software is incorporated into test systems. Micro system diskettes
   and release documentation are provided.

* Project Manager will assume lead role unless otherwise agreed upon by project team.

PDLC DESCRIPTION Part 4: Procedures 18-24A

PROCEDURE RESPONSIBILITY RATIONALE & DATE COMPLETED


DESIRED OUTPUT OR EXPLANATION

18. Audit Review & Sign-Off
    Responsibility: Dist. Svcs. UAT/Tech. Svcs./Data Center Svcs./CGIN (if required)
    Rationale & Desired Output: Audit review of the final test plan.

19. User Acceptance Test Execution
    Responsibility: Distribution Services UAT
    Rationale & Desired Output: Test to ensure that the product meets performance objectives for functionality and usability. Connectivity is included as appropriate.

19A. Install Hardware/Network Operating/Contingency Capacity
    Responsibility: Data Center Svcs./CGIN
    Rationale & Desired Output: Install required component(s) for the production effort: new hardware, systems software, firmware, network, etc.

19B. Receive Draft Marketing Materials
    Responsibility: Marketing Support/Global Capabilities/Marketing
    Rationale & Desired Output: Proofs for review of materials/aids. Documents are provided for tests.

19C. Contract Delivery & Execution
    Responsibility: Sales
    Rationale & Desired Output: Contract is delivered, negotiated and signed.

20. Production Acceptance Test (if required)
    Responsibility: Data Center Svcs./CGIN
    Rationale & Desired Output: Test of component(s) against required criteria; meets integration and standards benchmarks.

20A. Test Results Analysis (software and production)
    Responsibility: Dist. Svcs. UAT/CGIN/Global Capabilities/Marketing
    Rationale & Desired Output: Test findings are assembled and studied; recommendations are developed.

20B. Classroom Training Begins
    Responsibility: Training
    Rationale & Desired Output: Implementation of the training plan; classes begin.

21. Errors/Issues Identified
    Responsibility: Dist. Svcs. UAT/Marketing Support/Global Capabilities/Marketing
    Rationale & Desired Output: Errors and issues are addressed and prioritized (use the online database).

21A. Documentation Feedback
    Responsibility: Global Capabilities/Marketing/Marketing Support
    Rationale & Desired Output: Provide feedback on documentation regarding errors.

22. Reprogramming for Errors/Regression Testing until Acceptance
    Responsibility: Distribution Services UAT/Technical Svcs.
    Rationale & Desired Output: Fixes are resolved and re-implemented into source [return to step #16/unit tests]. Acceptance tests resume to confirm that issues are resolved to meet requirements.
23. Final UAT & User Acceptance Sign-Off
    Responsibility: Dist. Svcs. UAT/Tech. Svcs./CGIN/Marketing Support
    Rationale & Desired Output: Sign-off is obtained from UAT and the Primary Cash Applications Owners in Distribution Services. Product/changes must meet criteria; issues/modifications are defined. [See Change Control #10.]

24. Audit Review & Sign-Off
    Responsibility: Dist. Svcs. UAT/Tech. Svcs./Data Center Svcs./CGIN
    Rationale & Desired Output: Review of the project, all documentation and test results; demos if needed or appropriate.

24A. Customer Materials Printed
    Responsibility: Global Capabilities/Marketing/Marketing Support
    Rationale & Desired Output: Customer aids/documentation finalized and shipped to Product Mgmt.

PDLC DESCRIPTION Part 5: Procedures 25-32

PROCEDURE | RESPONSIBILITY | RATIONALE & DESIRED OUTPUT OR EXPLANATION | DATE COMPLETED

25. Software Release Control Process Begins
    Responsibility: Dist. Svcs. UAT
    Rationale & Desired Output: Software is scheduled for production; for micro products, upgraded diskettes and load procedures are prepared for distribution.

25A. Internal/External Communications
    Responsibility: Global Capabilities/Marketing/Marketing Support
    Rationale & Desired Output: Marketing, training, administration, communications and publicity procedures and processes are completed.

26. “Changeman” Sign-Off/Schedule Move/Complete Release Control
    Responsibility: Dist. Svcs. UAT/Tech. Svcs./Data Center Svcs./Global Capabilities/Marketing
    Rationale & Desired Output: Electronic sign-off/notification of the release date.

27. Data Center Change Meeting
    Responsibility: Technology Svcs./Data Center Svcs./CGIN
    Rationale & Desired Output: Changes are discussed, accepted or rejected. Data Center documentation, procedures and contingency plans are turned over and accepted. Operating contingencies are addressed and updated.

28. Data Center Turnover
    Responsibility: Data Center
    Rationale & Desired Output: Operating procedures.

28A. CGIN Turnover
    Responsibility: CGIN
    Rationale & Desired Output: Operating procedures.

29. PC Installation/Contingency Plan
    Responsibility: Implementation
    Rationale & Desired Output: PC software is loaded on the appropriate Bernoulli/cartridge. Beta/parallel begins and/or the rollout process begins.

29A. Mainframe Installation/Contingency Plan
    Responsibility: Data Center Svcs.
    Rationale & Desired Output: Change is installed with production version software.

29B. Classroom Training Completed
    Responsibility: Training
    Rationale & Desired Output: All training sessions have been completed.

29C. On-Job Training Support
    Responsibility: Training
    Rationale & Desired Output: Process in place for responding to start-up training needs.

30. PC Alpha/Beta/Parallel/Pilot
    Responsibility: Implementation
    Rationale & Desired Output: Production test against specific criteria; "burn-in" of the new software/system/product in a true production or customer environment. Test assumptions and the usability of screens and customer aids.

30A. Mainframe/Network Alpha/Beta/Parallel/Pilot
    Responsibility: Dist. Svcs. UAT/Data Center
    Rationale & Desired Output: Production test against specific criteria; "burn-in" of the new software/system/product in a true production or customer environment. Test assumptions and the usability of screens and customer aids.

31. Customer Roll Out/Feedback
    Responsibility: Representatives from all functional areas*
    Rationale & Desired Output: Software is installed on the customer PC; or, for mainframe, software access entitlement is granted or the user has access upon release into the production environment.

32. Post Implementation Review
    Responsibility: Tech. Svcs./Dist. Svcs. UAT/Global Capabilities/Marketing
    Rationale & Desired Output: 90- to 180-day post review against features and benefits realized, or "lessons learned". Financial impact and implications are reviewed.

* Project Manager will assume lead role unless otherwise agreed upon by project team.

5 Attachments

5.01 User Acceptance Test Plan Template and Sample

User Acceptance Test Plan Template

The Test Plan Template is stored in a separate document.

Also provided is the MSProject Work Plan corresponding to the Sample Work Breakdown Structure at the
end of Section 2.03.

User Acceptance Test Plan Sample

The Test Plan Sample is stored in a separate document.

Also provided is the MSProject Work Plan for the above test plan.

6 Glossary, Revisions and Concurrence

6.01 Glossary

Glossary of Testing Terms

BUILD

A logically complete subset of a system that can be tested independently of the rest of the application,
and subsequently integrated with other builds. The builds may be installed in production successively
or in groups, rather than all at one time.

The build may also represent a group of test runs that test a given subset of the system functionality,
even if the system is tested all at one time.

BUILD STRATEGY

The method of determining how the tests and, ideally, the system, should be divided into builds.

CHANGE CONTROL

The discipline that controls all changes and ensures that associated documents such as user manuals,
test plans, requirements specifications and design specifications are kept current with software.

CHANGE CONTROL PROCEDURE

The procedure that documents requested changes to a system, identifies system components affected
by change, obtains necessary approvals and prioritizes changes, tracks changes through the
development process, and provides an audit trail of all changes by system component.
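
For illustration only (this sketch is not part of the Citibank procedure), the kinds of information such a procedure tracks can be expressed as a simple record; all names and fields here are hypothetical:

```python
# Minimal sketch of a change-control record, for illustration only;
# the class and field names are hypothetical.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ChangeRequest:
    request_id: str
    description: str
    components_affected: list[str]                  # programs, files, documents, ...
    priority: str = "medium"
    approvals: list[str] = field(default_factory=list)
    history: list[tuple] = field(default_factory=list)  # audit trail of (date, event)

    def log(self, event: str) -> None:
        """Record an event so every change leaves an audit trail."""
        self.history.append((date.today(), event))

# Usage: track a request from submission through approval.
cr = ChangeRequest("CR-042", "Change wire cutoff time", ["PAY001", "User Guide"])
cr.log("submitted")
cr.approvals.append("UAT Project Leader")
cr.log("approved and prioritized")
```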

COMPONENT

An object needed for the testing of an application such as a program, database file, user guide, etc.

COVERAGE

The degree to which a set of tests validates a given set of requirements.
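
For illustration, coverage can be computed as the fraction of requirements exercised by at least one test case. The following sketch uses hypothetical requirement and test-case identifiers:

```python
# Minimal sketch: requirements coverage from a test-to-requirement mapping.
# All identifiers are hypothetical.
requirements = {"REQ-1", "REQ-2", "REQ-3", "REQ-4"}

tests = {
    "TC-01": {"REQ-1", "REQ-2"},   # requirements each test case validates
    "TC-02": {"REQ-2"},
    "TC-03": {"REQ-4"},
}

covered = set().union(*tests.values())
coverage = len(covered & requirements) / len(requirements)
print(f"coverage: {coverage:.0%}")                    # coverage: 75%
print(f"untested: {sorted(requirements - covered)}")  # untested: ['REQ-3']
```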

DELIVERABLE

Any product of the development, test planning or test execution process, such as a document or
software module.

LEAF-LEVEL REQUIREMENT

The lowest level of detailed requirements, i.e., a requirement that cannot be decomposed further.
Requirements should be decomposed until the leaf-level requirements are sufficiently detailed to be
testable.
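
A hypothetical sketch of such a decomposition, where the leaves (entries with no children) are the testable requirements:

```python
# Minimal sketch of requirements decomposition; all requirement names are
# hypothetical. A leaf-level requirement is a node with no children.
requirement_tree = {
    "REQ-1 Process payments": {
        "REQ-1.1 Validate payment instruction": {},      # leaf: testable
        "REQ-1.2 Route payment": {
            "REQ-1.2.1 Route domestic wires": {},        # leaf: testable
            "REQ-1.2.2 Route international wires": {},   # leaf: testable
        },
    },
}

def leaf_requirements(tree: dict):
    """Yield the leaf-level requirements (nodes with no children)."""
    for name, children in tree.items():
        if children:
            yield from leaf_requirements(children)
        else:
            yield name

print(list(leaf_requirements(requirement_tree)))
```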

LOGICAL DAY

A period of time during which the date used by the system must remain unchanged to simulate a
business day.
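
For illustration only, a test harness might hold the system date fixed with a clock object like the hypothetical one below, advancing it only when the next logical day begins:

```python
# Minimal sketch, for illustration only; the class name is hypothetical.
from datetime import date, timedelta

class LogicalDayClock:
    """Returns a fixed business date until the tester advances the logical day."""
    def __init__(self, business_date: date):
        self.business_date = business_date

    def today(self) -> date:
        return self.business_date

    def advance_day(self) -> None:
        self.business_date += timedelta(days=1)

clock = LogicalDayClock(date(1996, 2, 28))
assert clock.today() == date(1996, 2, 28)  # every test in this logical day sees 02/28
clock.advance_day()                        # begin the next logical day
assert clock.today() == date(1996, 2, 29)  # 1996 was a leap year
```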

MILESTONE

An event representing the completion of a set of related tasks. A milestone is represented by a single
date, and has no resources associated with it.

PARALLEL TEST

The process of exercising a system, or some of its functions, in its old (current) and new (replacement)
version with the same data, or exercising a system along with a previous system or process, then
comparing the results.
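
A minimal sketch of the comparison step, assuming hypothetical old and new versions of a single calculation run against the same data:

```python
# Minimal sketch of a parallel test; function names and data are hypothetical.
def old_interest(balance):             # current (old) calculation
    return round(balance * 0.05, 2)

def new_interest(balance):             # replacement (new) calculation
    return round(balance * 5 / 100, 2)

test_data = [0, 100.00, 2500.50, 99999.99]

# Run the same inputs through both versions and report any mismatches.
mismatches = [
    (bal, old_interest(bal), new_interest(bal))
    for bal in test_data
    if old_interest(bal) != new_interest(bal)
]
print("parallel test passed" if not mismatches else f"mismatches: {mismatches}")
```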

PILOT TEST

The process of exercising a system in a production environment that is physically complete, including
manual procedures and real data, but limited in scope (e.g., to a single user location) to ensure that the
application system works as a whole, before it is implemented on a wider basis.

REGRESSION TEST

Re-execution of test cases to ensure that

 All errors have been removed. (Tests that previously failed should now execute correctly.)

 All changes and enhancements have been made correctly. (Some tests executed previously will
give different results. New tests will be required for enhancements.)

 No new defects have been introduced. (Tests that previously executed correctly should still
execute correctly.)
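
For illustration, these conditions map naturally onto re-runnable assertions. The following pytest-style sketch uses a hypothetical function and shows the first condition (a previously failing test now passes) and the third (a previously passing test still passes):

```python
# Minimal regression-test sketch; the function under test is hypothetical.
def format_account_number(raw: str) -> str:
    """Pads an account number to ten digits after stripping blanks."""
    return raw.strip().zfill(10)

def test_existing_behavior_unchanged():
    # Passed before the fix; must still pass (no new defects introduced).
    assert format_account_number("12345") == "0000012345"

def test_fixed_defect_now_passes():
    # Failed before the fix (leading blanks were kept); must pass now.
    assert format_account_number("  12345") == "0000012345"
```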

REQUIREMENTS

Requirements represent what an application is supposed to do and the standards against which tests
must be measured; therefore, all test plans are based on requirements. There are different types of
requirements, such as user, technical, audit and control, corporate standards, and operational, with the
most important being user requirements. The primary objective of any system is to satisfy user
requirements. Requirements are drawn from specified deliverables such as the Business
Requirements Document and the Functional Specifications.

REQUIREMENTS DEFECT

An omission, ambiguity, inconsistency, standards violation, or other problem in a requirements
document that, if not corrected, will lead to a corresponding defect in the code.

REQUIREMENTS VALIDATION MATRIX

Cross-references between the system requirements and the builds, test run and test case
specifications, which ensure that all requirements have been adequately tested.
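
A minimal sketch of such a matrix as a simple cross-reference table, with hypothetical identifiers; requirements lacking test coverage surface immediately:

```python
# Minimal sketch of a requirements validation matrix; identifiers are hypothetical.
matrix = {
    # requirement:       test cases that validate it
    "REQ-1 Login":       ["TC-01", "TC-02"],
    "REQ-2 Transfer":    ["TC-03"],
    "REQ-3 Audit log":   [],          # gap: no test coverage yet
}

for requirement, test_cases in matrix.items():
    status = ", ".join(test_cases) if test_cases else "** NOT TESTED **"
    print(f"{requirement:<18} {status}")
```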

REQUIREMENTS PROBLEM REPORT

A formal report of a defect or suggested change in requirements.

SYSTEM TROUBLE REPORT

A formal report of a problem or suggested change in code under test.

TASK

An identifiable unit of work, having start and end dates and requiring specified resources. Tasks may
be broken down to multiple levels, possibly using different terms for each level.

TEST CASE

The smallest unit of work in the testing process producing results that are predictable and verifiable.

TEST PLAN

A document describing the approach to performing any level of testing in a given project,
e.g., User Acceptance Testing, developed before the test cases/scripts for that level of
testing.

TEST RUN

A series of related test cases that test a specific set of system requirements.

TEST SCRIPT

As used in the Citibank Global Cash Management UAT groups, a subdivision of a test case that
typically contains detailed action steps or specific data.

USER(S)

Unless otherwise specified, the internal Citibank sponsors/requesters of a system or project.

USER ACCEPTANCE TEST PLAN

A document defining the criteria to be met for user acceptance of a system, detailing the strategy and
listing the test cases to be used in acceptance testing.

USER ACCEPTANCE TESTING

The final formal level of testing for a system in which the users determine if a system operates
according to its basic overall business requirements. The goal is to uncover areas where the system
does not meet user expectations.

USER ACCEPTANCE TESTING (UAT) PROJECT LEADER

The individual designated to lead that portion of a development or maintenance project that is
carried out within the UAT organization, primarily the planning and execution of User
Acceptance Tests.

WORK PLAN

The part of a test plan that details the organization/responsibilities, work breakdown
structure (tasks and milestones), schedule and resources required to accomplish all phases of
test planning and execution.

WORK PRODUCT

Any deliverable of the project life cycle. The destination, audience or users of the
deliverable may be within Citibank’s Technology, UAT or sponsor organizations (e.g., for
a Business Requirements Document) or at the external end user organizations (e.g., PC
code provided on diskette).

6.02 Revisions

Revision History/Plan

VERSION NO.   DATE       DESCRIPTION

1.0                      Original version of the UAT Life Cycle.

1.1                      Minor revisions.

1.2           02/28/96   Minor revisions.

2.0           TBD        Major revision incorporating several new procedures to
                         complete the description of the UAT methodology, as well
                         as the Road Maps. New procedures are:

                          Requirements/Design Change Management
                          Test Planning and Management
                          UAT Estimating
                          Requirements Validation and Defect Reporting

                         as well as the User Acceptance Test Plan Template and
                         Sample.

                         This version is the first to be placed in an online
                         database.

6.03 Concurrence

Concurrence of UAT Management

We have concurred by signature that these procedures will be followed. Any exceptions will be
documented and approved by management.

_______________________ _______________________
John V. Elefante Anthony R. Fusco

_______________________ _______________________
Pat D’Agnese Caryl Leong

_______________________ _______________________
Mary V. Erdman Felix Litvinsky
