
{Company Name} {Department Name}
{Name} System [Version|Release m.n]
User Acceptance Test Plan

Written by: {Name(s)}
Organization: {Name}
Location: {Location}
Original Publication Date: {mm/dd/yy}
Current Plan Version: {m.n}
Current Publication Date: {mm/dd/yy}

How to Use the Template


The Template provides a shell for a User Acceptance Test Plan. The template contains specifications of the information to be filled in, as follows:
- Text in normal type is intended to be used as is.
- Text in italic type is either a description or placeholder to be replaced with the information described (e.g., replacing describe briefly with an actual description), or instructions to be deleted once they have been followed (e.g., select one of the following alternatives).
- {Text in braces - normal or italic} specifies required information.
- [Text in brackets - normal or italic] specifies optional information.
- The vertical bar within braces or brackets specifies a choice: either this|or that.

To prepare the plan, fill in the information specified within the Template, as described above, guided by the UAT methodology. Change the footer to reflect the plan being produced, with its version and date. The Test Plan Sample provides an example of a test plan following the template. The text may also be varied as required, as long as the underlying approach is followed. Familiarity with Word 6.0 is assumed. The following are some key tips:
- Use Edit/Find to locate all occurrences of { and [.
- If not familiar with using styles, to get a heading of any level, copy an existing one and insert it where the new heading is needed. The number will be updated automatically. Then change the text as needed.
- Update the Table of Contents at any time by placing the cursor in it and hitting F9. The entire Table of Contents should be updated, not just the page numbers, if you have added or changed headings at Level 1 or 2.
These instructions are placed within the Template for convenience of reference. Delete this entire page when finished, and do a final update of the Table of Contents.


Table Of Contents
1. Introduction
2. Strategy
3. Work Plan
4. Testing Procedures
5. Test Case Design
6. Test Case Execution
Attachment A: Requirements Hierarchy
Attachment B: Requirements Validation Matrices
Attachment C: Work Plan
Attachment D: Test Case Design


1. Introduction
This Test Plan describes how and when User Acceptance Testing will be performed to ensure that [version|release m.n of] the {name} System performs according to its business requirements and functional specifications. The business purpose of the {name} System is {brief statement of overall business functionality}. The business owner of this system is {name, title, location, phone}. The objectives of the current {version|release|project} are: {list/describe briefly}.

The business justification is {describe briefly}.

1.1. Document Structure


Section 1 continues below to describe the purpose and scope of the test, the assumptions underlying the plan, and the risks that may impede testing or implementation. Section 2 details the strategy for the user acceptance test. Section 3 provides the work plan for the test. Section 4 includes the procedures used to control the test. Section 5 lists the test cases to be executed. Section 6 lists the components involved in the test. The Glossary in the Appendix defines key testing terms.

[The attachments contain the following parts of the plan that are lengthy and/or are prepared by software other than word processing:] {Select/change the items in the following list as needed. Note if an item is stored separately rather than as part of this document.}
Attachment A: Requirements Hierarchy
Attachment B: Requirements Validation Matrices
Attachment C: Work Plan
Attachment D: Test Case Design]

1.2. Purpose and Scope of the User Acceptance Test


User acceptance testing is performed to verify that the total system, both software deliverables and associated non-software deliverables (documentation, forms, procedures, etc.), will function successfully together in the business environment and will fulfill user expectations as defined in the business requirements and functional specifications. User acceptance testing normally comprises the final set of tests to be performed on the system or release.

This acceptance test will be {carried out|coordinated} by the {name} Business Acceptance Testing group located at {location}. The {name(s)} user organizations located at {location(s)}, respectively, will participate as follows: {Describe involvement in test planning, test case development and/or test execution, including reviews and signoffs}
1.
2.
The acceptance test will begin on {mm/dd/yy} and be completed by {mm/dd/yy}.

1.3. Assumptions
Prior to acceptance testing, the tests listed below will have been performed, and the results reviewed by the User Acceptance Testing group. These tests are considered to have been satisfactorily completed, with the exceptions noted below. {To be completed in updating the plan prior to acceptance test execution.}
1. Unit testing by the {name} group at {location}. Exceptions: {None|describe briefly and reference detail}.
2. Integration testing by the {name} group at {location}. Exceptions: {None|describe briefly and reference detail}.


3. System testing by the {name} group at {location}. Exceptions: {None|describe briefly and reference detail}.
4. [Other, if any. Exceptions: {None|describe briefly and reference detail}.]

This test plan describes all of the remaining tests to be performed on the {system|release} [except for describe any subsequent tests not covered by this plan and reference the appropriate test plan(s)]. An adequate testing environment at {location} will be available beginning {n} days before the start of acceptance testing for setup and shakedown.

{Select one of the following two items.}
The software and non-software deliverables will be made available to the UAT group in builds, as detailed in the Build Strategy section below.
The entire system will be made available to the UAT group at one time on or before the above start date. Hence the term "build" as used in this plan refers only to the top level of the test hierarchy structure.

Adequate staff resources with appropriate knowledge and/or training will be available, as specified in the work plan, during the period required to set up and complete the test. {Insert any other assumptions as appropriate.}

1.4. Risk Assessment


Significant risks that are inherent in the system design or environment, or that result from the selected approach to development or testing, are listed below. The Severity level (High, Medium, Low) represents the overall significance of the risk, i.e., the combination of its likelihood of occurrence and its degree of impact if it occurs.

Risks that may impact the ability to complete the acceptance test successfully and on time:


Description | Severity | Avoidance/Minimization Approach

Risks that may impact the ability to install the system on time and/or operate the system successfully:

Description | Severity | Avoidance/Minimization Approach
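As a concrete illustration of how the Severity column can be derived from the two factors named above (likelihood of occurrence and degree of impact), the following is a minimal sketch in Python. The three-by-three rating scheme is an assumption made for illustration only; it is not prescribed by this template and should be replaced by whatever rating convention the project actually uses.

```python
# Illustrative sketch only: the likelihood/impact-to-severity mapping below is
# an assumed example, not a scheme prescribed by this template.
RATING = {"Low": 1, "Medium": 2, "High": 3}

def severity(likelihood: str, impact: str) -> str:
    """Combine likelihood of occurrence and degree of impact into a severity level."""
    score = RATING[likelihood] * RATING[impact]
    if score >= 6:        # e.g., High x Medium or worse
        return "High"
    if score >= 3:        # e.g., Medium x Medium, or High x Low
        return "Medium"
    return "Low"

# Example: a risk that is likely to occur but has limited impact.
print(severity("High", "Low"))  # Medium
```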


2. Strategy
This key section of the test plan describes the approach used to assure that the system is thoroughly tested. It details the levels and types of tests to be performed and the requirements to be tested, as well as the coverage and build strategies [and the approach to pilot and parallel testing]. Finally, it presents the validation matrices or equivalent that assure coverage of the requirements at all levels.

2.1. Test Levels


This acceptance test will include the following test level(s): {Delete any not applicable.}
1. Normal acceptance testing in the UAT environment at {location}.
2. Parallel testing against {release m.n of this system|the {name} system|the {name} process} in the {UAT|production} environment at {location}.
3. Pilot testing in production at {location(s)}.

2.2. Types of Tests


The following types of tests will be performed {delete any not applicable}:
- Environment tests: Tests that validate the functioning of the system with the hardware, system software and networks.
- Positive functional tests: Response to valid user actions and valid data.
- Negative functional tests: Response to user actions or data that should generate error messages.
- Invalid input tests: Response to inputs or user actions that may be unanticipated by the system design.
- Usability tests: Verification of the ease of use of the system and its associated documentation and procedures, including recovery procedures.
- Control tests: Tests of the system's ability to produce appropriate audit trails.
- Security tests: Tests of the system's ability to restrict access to data or functions.


- Capacity/performance tests: Tests of the system's ability to handle specified volumes, or produce specified response time or throughput.
- Regression tests: Repeated tests in any of the above categories that verify that problems were fixed, or other changes were made, correctly and without adverse impact on other functions.
- [Other tests: Describe, if any.]
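To illustrate the distinction between positive and negative functional tests, the following is a minimal sketch using the pytest framework. The order_system module, the validate_order function, and the error text are hypothetical stand-ins; they are not part of this template and merely represent whatever interface the {name} System actually exposes to testers.

```python
# Illustrative sketch only: order_system, validate_order and the error text are
# hypothetical stand-ins for the real {name} System interface under test.
import pytest

from order_system import ValidationError, validate_order  # hypothetical module


def test_valid_order_is_accepted():
    # Positive functional test: valid user data should be accepted.
    result = validate_order(customer_id="C-1001", quantity=5)
    assert result.accepted


def test_zero_quantity_is_rejected():
    # Negative functional test: invalid data should raise the specified error.
    with pytest.raises(ValidationError, match="quantity must be greater than zero"):
        validate_order(customer_id="C-1001", quantity=0)
```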

2.3. Requirements Identification


The requirements to be validated by this test plan originate in the following document(s):

Title | Version | Date

{Select one of the following two items.}
The requirements to be validated by this test plan have been identified and decomposed hierarchically, and are shown {below|in Attachment A}.
The requirements to be validated by this test plan have been decomposed hierarchically and are stored in {tool or environment} on {workstation|server identification} as {drive:\directory path\file}. [A hard copy listing is found in Attachment A.] {Hard copy should be attached unless access to the on-line material is available and familiar to all concerned.}
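As an illustration of what "decomposed hierarchically" means in practice, the sketch below shows a three-level hierarchy expressed as a Python data structure. The requirement numbers and names are hypothetical examples, not part of this template; the three levels correspond to the high-, intermediate- and detail-level requirements mapped to builds, test runs and test cases in Section 2.8.

```python
# Illustrative sketch only: requirement numbers and names are hypothetical.
# High-level requirements decompose into intermediate-level requirements,
# which decompose into detail-level requirements (see Section 2.8).
requirements_hierarchy = {
    "1 Order Entry": {                           # high level: validated by a build
        "1.1 Capture order": [                   # intermediate level: validated by a test run
            "1.1.1 Accept valid order data",     # detail level: validated by a test case
            "1.1.2 Reject zero or negative quantities",
        ],
        "1.2 Confirm order": [
            "1.2.1 Display confirmation number",
        ],
    },
}
```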

2.4. Requirements Coverage Strategy


{Select one of the following two paragraphs.}
This acceptance test of a new system is designed to validate nnn% {normally 100%} of the requirements relating to the environment, the software deliverables, and the non-software deliverables. {Describe any exceptions.}
This acceptance test of corrections and/or changes is designed to validate nnn% {normally 100%} of the new and changed requirements relating to the environment, the software deliverables, and the non-software deliverables, as well as {nn%} of pre-existing requirements via regression tests. {Describe any exceptions.}

The coverage of requirements is verified by the {Select one of the following:}
Requirements Validation Matrices shown in {Section 2.8|Attachment B}.
{Name of report} produced by {tool or environment} on {workstation/server identification} from the input file {drive:\directory path\file}. [A hard copy listing is found in Attachment B.] {Hard copy should be attached unless the on-line material is available to all concerned.}
{Other approach.}
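Where the validation matrices are maintained as data (for example, exported from the tracking tool named above), coverage can also be checked mechanically. The following is a minimal sketch under an assumed layout: a hypothetical CSV export in which each row holds a requirement number and a semicolon-separated list of the test cases that validate it. The file name and column names are illustrative only and are not prescribed by this template.

```python
# Illustrative sketch only: the CSV layout and file name are hypothetical.
# Each row is assumed to hold a requirement number and a semicolon-separated
# list of the test cases that validate it.
import csv

def uncovered_requirements(matrix_path: str) -> list[str]:
    """Return requirement numbers that no test case validates."""
    uncovered = []
    with open(matrix_path, newline="") as f:
        for row in csv.DictReader(f):
            test_cases = [t for t in row["test_cases"].split(";") if t.strip()]
            if not test_cases:
                uncovered.append(row["requirement"])
    return uncovered

if __name__ == "__main__":
    gaps = uncovered_requirements("detail_level_matrix.csv")
    print("Coverage is 100%" if not gaps else f"Uncovered requirements: {gaps}")
```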

2.5. Build and Test Run Strategy


The builds listed below represent the highest-level logical grouping of tests [and the stages in which the system will be delivered to UAT] [as well as the stages in which the system will be installed in production]. The system components to be validated by these builds will be delivered to UAT {on the dates listed below|at the start of user acceptance test execution}. The test runs within each build are listed below. The system components in each build are listed in Section 6.1. Coverage of high-, intermediate- and detail-level requirements by the builds, test runs and test cases, respectively, is validated as described in Section 2.8.

Build No. | Build Name | Software/Non-Software | Delivery Date

The sequence of builds reflects the following dependencies and other factors relating to the development and/or testing processes and the functionality to be delivered: {List factors leading to choice of build structure and sequence.}

{Include either or both of the following two items, if applicable.}
Builds will be moved to parallel testing {individually|all at one time}. {Detail as required.}
Builds will be moved to pilot testing {individually|all at one time}. {Detail as required.}
{Reproduce either of the subsections below for each build. Build numbers should be unique through all builds.}


2.5.1. Software Build {n}: {name}

2.5.1.1. Description

{List functions included.}

2.5.1.2. Test Runs

The tests for this build will be divided into the following test runs, each generally representing a single on-line session or batch job unless otherwise described in the text:

Run No. | Name | Description/Objective(s)

2.5.1.3. Test Files

The following test files must be available to test this build. The names provide a reference to the Test Execution section below.

Name | Description | Content* | Source**

{*e.g., empty file, valid records, records processed by specified subsystem or function}
{**e.g., from developers, create by specified tool}

2.5.1.4. Test Tools

The following tools will be used in testing this build.

Tool Name | Tool Type* | Purpose/How Used | Required?**

{*e.g., Test Management, Capture/Replay or Scripting, Test Data Generator, File Comparison, Interactive Debugging, other categories such as Simulation or Performance Monitoring}
{**i.e., use of this tool is required by current standards/methodology}


2.5.2. Non-Software Build {n}: {name}

2.5.2.1. Description

This build includes the following component(s): {Name: description}

2.5.2.2. Acceptance Criteria

{List criteria, e.g., conformance to standards or specifications, readability/usability}

2.5.2.3. Validation Method(s)

{List approach(es), e.g., inspection, review, user test. If appropriate, list test runs as above.}

2.6. Parallel Test Approach


{Describe how the parallel test will be executed and validated. If possible, list test runs as above.}

2.7. Pilot Test Approach


{Describe how the pilot test will be executed and validated. If possible, list test runs as above.}

2.8. Requirements Validation Matrices


{Select one of the following two items.}
The Requirements Validation Matrices are shown {below|in Attachment B}.
The Requirements Validation Matrices or equivalent have been developed as the {name of report} produced by {tool or environment} on {workstation/server identification} from the input file {drive:\directory path\file}. [A hard copy listing is found in Attachment B.] {Hard copy should be attached unless access to the on-line material is available and familiar to all concerned.}


2.8.1. High-Level Validation Matrix

The following matrix has been used in determining the build strategy and verifying coverage of high-level requirements by builds. An X under a build number indicates that the build tests the requirement at the left.

High-Level Requirement Number | Name | Build Number

2.8.2. Intermediate-Level Validation Matrices

The following matrices have been used in determining the test run strategy and verifying coverage of intermediate-level requirements by test runs. An X under a test run number indicates that the test run tests the requirement at the left. {Repeat the following as required for each build, including any non-software builds that will be validated by actual testing.}

Intermediate-Level Requirement Number | Name | Build n: Test Run Number

2.8.3. Detail-Level Validation Matrices

The following matrices have been used in determining the test case strategy and verifying coverage of detail-level requirements by test cases. An X under a test case number indicates that the test case tests the requirement at the left. {Repeat the following as required for each test run.}

Detail-Level Requirement Number | Name | Build n, Test Run m: Test Case Number


3. Work Plan
This section details the organization of the user acceptance test team, the breakdown of the work into tasks and milestones and the resources needed to execute the tests.

3.1. Organization and Responsibilities


The User Acceptance Test team will be organized as follows: {The items indicated, or the equivalent, are required. Multiple responsibilities may be assigned to the same person where appropriate. Include all participating UAT and user staff members.}

Name | Phone | Responsibility
| | Lead User Representative
| | User Acceptance Signoff
| | Development Project Leader/Liaison
| | UAT Team Leader
| | Test Plan Development
| | Test Case/Script Development
| | Test Execution

3.2. Major Tasks, Milestones and Target Dates


All tasks, milestones and target dates are listed in the Work Breakdown Structure (WBS). The following major milestones and target dates are shown here to facilitate management/user review. {The items indicated, or the equivalent, are required. Include UAT start and completion dates for each build and test level (e.g., normal UAT/Parallel/Pilot).}

Milestone or Task (M/T): Name | Start Date* | End Date
T: UAT Plan Development | |
T: UAT Case/Script Development | |
T: UAT Environment Setup | |
M: Delivery of {Build 1|System} to UAT | N/A |
T: UAT Environment Checkout | |
T: UAT Execution [Build 1] {Include any remaining builds below.} | |
M: First Acceptance Transmittal to User | N/A |
M: Second Acceptance Transmittal to User (if needed) | N/A |

{*Omit for milestones - see Glossary}

3.3. Work Breakdown Structure


{Complete either or both of the following two items. Include views showing the breakdown and schedule of tasks and milestones, the loading of resources, and the dependencies/critical path.}
The detailed work breakdown structure is shown {below|in Attachment C}.
The detailed work breakdown structure is stored in {tool or environment, e.g., MS Project} on {workstation/server identification} as {drive:\directory path\file}. [Hard copy listings are found in Attachment C.] {Hard copy should be attached unless access to the on-line material is available and familiar to all concerned.}

3.4. Resources Needed for Test Execution


The required configuration of the test environment is specified below.

3.4.1. Hardware

Type | Quantity | Date Needed | Location(s)

Issues: {Describe any potential conflicts or other issues impacting hardware access/capacity/availability, their impact, and how they will be dealt with, or indicate None.}


3.4.2. Software/USERIDs/Access/Accounts

Type and Version (where significant) | Date Needed | Location(s)

Issues: {Describe any potential conflicts or other issues impacting software availability, their impact, and how they will be dealt with, or indicate None.}

3.4.3. Networks

Type | Date Needed | Location(s)

Issues: {Describe any potential conflicts or other issues impacting network access/capacity/availability, their impact, and how they will be dealt with, or indicate None.}


4. Testing Procedures
This section covers the procedures used to control the acceptance test. They include the areas of library control, problem reporting, change management, test execution control, and user acceptance. {For each of the following, reference may be made to existing documented procedures, giving version and date, and describing any exceptions for this project. Otherwise describe procedures in detail. Suggested text is included below.}

4.1. Library Procedures


The Software Migration procedure found in Section 2.06 of the UAT Life Cycle Version 2.0 will control the process for moving software components between libraries assigned to the defined test levels. Libraries will be physically controlled by the ChangeMan library management tool. The following libraries will be used: {Vary text as needed.}

Library (Level) | Name(s) | Description/Usage
Unit Test | | Unit Test Source Modules
Unit Test | | Unit Test Object Modules
Unit Test | | Unit Test Load Modules
Integration Test | | Integration Test Source Modules
Integration Test | | Integration Test Object Modules
Integration Test | | Integration Test Load Modules
Acceptance Test | | Acceptance Test Source Modules
Acceptance Test | | Acceptance Test Object Modules
Acceptance Test | | Acceptance Test Load Modules
Production | | Production Source Modules
Production | | Production Object Modules
Production | | Production Load Modules

4.2. Problem Reporting Procedures


The Requirements Validation and Defect Reporting procedure found in Section 2.05 of the UAT Life Cycle Version 2.0 will be used to document and control all problem reports and change requests during requirements validation. Requirements problem reports and change requests will be composed using the Requirements TR form and stored in the Lotus Notes Requirements Problem Report database.

The Problem Reporting procedure found in Section 2.07 of the UAT Life Cycle Version 2.0 will be used to document and control problem reports and change requests during normal acceptance test execution. Testers will document all problem reports and change requests using the system TR form. This facility can be accessed directly from the test scripts in Notes.

4.3. Change Management Procedures


The Requirements/Design Change Management procedure found in Section 2.02 of the UAT Life Cycle Version 2.0 will be used to document and control all changes requested directly, or made in response to reported problems, during requirements validation and to manage the changes to documentation work products. Procedures developed by the Technology organization will be used to document and control all changes requested directly, or made in response to problem reports, during normal acceptance test execution and pilot testing, and to manage the changes to software work products.

4.4. Test Execution Procedures


The UAT Execution, Software Migration, and UAT Management procedures found in Sections 2.07, 2.06 and 2.03, respectively, of the UAT Life Cycle Version 2.0 will be used to manage and control test execution, including promotion of builds or programs from integration testing, execution of the test cases, and demotion to unit testing of programs or builds for which test cases have failed to execute correctly. Test execution will continue until the software has been fully exercised or a decision is reached to demote the entire system back to the developers.
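The promotion and demotion flow described above can be pictured as movement between the library levels listed in Section 4.1. The following is a minimal sketch of that flow; it is illustrative only and is not part of the referenced UAT Life Cycle procedures. The level names mirror Section 4.1, and the transition rule (demote on failed test cases, promote on success) paraphrases this section.

```python
# Illustrative sketch only: models the promotion/demotion flow described in
# Sections 4.1 and 4.4; it is not part of the UAT Life Cycle procedures.
LEVELS = ["Unit Test", "Integration Test", "Acceptance Test", "Production"]

def next_level(current: str, test_cases_passed: bool) -> str:
    """Promote a build one level on success; demote to Unit Test on failure."""
    i = LEVELS.index(current)
    if not test_cases_passed:
        return LEVELS[0]                             # failed cases: demote for rework
    return LEVELS[min(i + 1, len(LEVELS) - 1)]       # success: promote (or stay at Production)

# Example: a build in Acceptance Test whose test cases all pass moves toward Production.
print(next_level("Acceptance Test", test_cases_passed=True))   # Production
print(next_level("Acceptance Test", test_cases_passed=False))  # Unit Test
```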

4.5. Certification Procedures


The system will be moved into production {all at one time|by builds listed above|in the following stages:}
1. {List/describe as appropriate.}
Before the initial pilot rollout, the Certification procedure specified in Section 2.09 of the UAT Methodology will be followed.


5. Test Case Design


{Select one of the following two items.}
Each of the test runs defined above consists of one or more test cases {as listed below|as listed in Attachment D}.
The listing of test cases has been developed as the {name of report} produced by {tool or environment} on {workstation/server identification} from the input file {drive:\directory path\file}. [A hard copy listing is found in Attachment D.] {Hard copy should be attached unless access to the on-line material is available and familiar to all concerned.}

Coverage of detail-level requirements by the test cases listed for each build is validated by the Detail-Level Validation Matrices or equivalent (see Section 2.8). The details of each test case, and associated test scripts, are stored in the {Name} Lotus Notes database. {Repeat the headings and table below as required, filling in details.}

5.1. Build {m}: {Name}

5.1.1. Test Run {n}

Objective(s):

Case No. | Description
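As an illustration of the level of detail typically captured for each case, the sketch below shows one hypothetical test case record. The field names and values are examples only; they do not reflect the layout of the {Name} Lotus Notes database referenced above.

```python
# Illustrative sketch only: field names and values are hypothetical and do not
# reflect the actual {Name} Lotus Notes test case database.
from dataclasses import dataclass, field

@dataclass
class TestCase:
    case_no: str                 # e.g., "B1-R2-C3" = Build 1, Run 2, Case 3
    description: str             # what the case demonstrates
    requirement: str             # detail-level requirement validated (see Section 2.8.3)
    steps: list[str] = field(default_factory=list)
    expected_result: str = ""

example_case = TestCase(
    case_no="B1-R2-C3",
    description="Reject an order with zero quantity",
    requirement="1.1.2",
    steps=["Open the order entry screen", "Enter quantity 0", "Submit the order"],
    expected_result="Error message: quantity must be greater than zero",
)
```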


6. Test Case Execution


This section provides the remaining details needed to initiate test execution. These include the components to be tested and the test data components that will be accessed during the tests.

6.1. Components to be Tested


This section lists the software and non-software components required for the tests.

6.1.1. Software Components

The following new or changed (N/C) software components will be tested. {Include all types of components, including JCL, DBMS coding, etc.}

Build No.* | Component Name | N/C | Description

{*May be left blank if software is delivered all at one time.}

6.1.2. Execution Control Procedures

The following existing, new or changed (E/N/C) stored procedures will be used to install or remove software components or to control system execution. New or changed procedures will be tested as part of the acceptance test.

Build No.* | Procedure Name | E/N/C | Description

{*May be left blank if information is the same for all builds.}

6.1.3. User Documentation

The following existing, new or changed (E/N/C) user procedures or other documentation will be referenced and applied during test execution. New or changed procedures or other documentation (non-software deliverables) will be tested as part of the acceptance test:


Build No.* | Procedure Name | E/N/C | Description

{*May be left blank if the same procedures are used in all builds.}

6.1.4. Operations, Network and/or Administrative Procedures

The following existing, new or changed (E/N/C) procedures or other documentation will be referenced and applied during test execution by operations, network or other administrative staff (e.g., security administrators). New or changed procedures or other documentation (non-software deliverables) will be tested as part of the acceptance test:

Build No.* | Procedure Name | E/N/C | Description

{*May be left blank if the same procedures are used in all builds.}

6.2. Test Data Components


The following existing, new or changed (E/N/C) physical data components will be accessed by the system during test execution. New or changed components will be created or made available as described in the Build Strategy section above.

Build No.* | Component Name | E/N/C | Type** | Description***

{*May be left blank if the same components are used in all builds.}
{**E.g., flat file, VSAM file, database, database table - specify DBMS name where applicable.}
{***Indicate if access by the build is Create, Read, Update or Delete (C/R/U/D), as well as any access by other systems or builds.}


Attachment A: Requirements Hierarchy


Attachment B: Requirements Validation Matrices


Attachment C: Work Plan


Attachment D: Test Case Design

