
Department or Program Name

[SYSTEM/APPLICATION NAME]
Quality Assurance
Test Plan
SYSTEM/APPLICATION QUALITY ASSURANCE TEST PLAN
Project QA/Test Plan
Template Guidelines
To aid in the creation of a successfully completed Project QA/Test Plan, please review the following guidelines. For additional instructions and information, please refer to the University Services Program Management Office. Remove these guidelines from the completed document.
Purpose    The Project QA/Test Plan is an umbrella plan that encompasses the entire testing required for a project. It is the highest-level testing plan that documents the testing strategy for the project, describes the general approach that will be adopted in testing, provides the overall structure and philosophy for any other required testing documents, and defines what will be tested to ensure the requirements of the product, service, or system (i.e., the deliverables) are met.

Based on the size, complexity, and type of project, subsequent test plans may be created and used during the testing process to verify that the deliverable works correctly, is understandable, and is connected logically. These test plans include the following categories and types. Testing areas/coordinators would have templates specific to these plans.
Types of quality assurance planning include:

- Process QA plans document how the project is meeting quality and compliance standards, ensuring the right documentation exists to guide project delivery and corrective actions.
- Requirements/application test plans document how the product, service, or system meets stated business and technical requirements, and how it will work within the defined operational/business processes and workflow.
Test plans typically used for development efforts include:

- Unit Test Plans (UT) document what the programmer is coding for a particular screen or unit.
- Integrated Systems Test Plans (IST) document how the system will be tested to ensure major errors are fixed and how end-to-end testing between one-off applications will occur.
- Integration Application Test Plans (IAT) document how applications will test all new functionality end-to-end across all applications.
- User Acceptance Testing (UAT) documents how the users who will be supporting the new and existing functionality in production will test this new functionality prior to production installation.
- Operations Interface Readiness Test Plans (OIR) document how the operations user will test all new and existing functionality.
Testing approaches that may be used to test or validate projects that have high risk or impact include:

- Regression Testing ensures all fixes identified during IAT and UAT were made to the system and did not impair key existing functionality (used mainly with new major software development projects).
- Conversion Integration Application Testing is used to simulate the first couple of processing days following a conversion.
- Prototype is a technique used in the design, build, or testing stage to construct a model of the product, service, or system to verify a function, a design, how a particular module or program works, etc.
- Proof of Concept is used to test an idea in a controlled environment to test new features.
- Alpha Testing tests a new product, service, or system before implementation, where it may have major impacts and the project team wants to identify major problems/bugs before it goes into production.
- Beta Testing differs from Alpha Testing in the amount of testing and clean-up that needs to be performed; the project should be ready for implementation.
- Pilot is a limited use of the product, service, or system by a remote site, a subset of the customer base, etc. This technique is used to work out logistical problems, validate the deliverable, and minimize impact through a limited rollout.
- Parallel Testing occurs when the old and the new product, service, or system run simultaneously so the project customer can check that the deliverable is working per specifications.
- Stress/Load Testing ensures the system will perform reliably under full production and heavy workloads.
- Operational/Business Readiness Testing walks through the operational/business processes and workflows to ensure that procedures, documentation, reconciliation, training, and workflows are complete and correct.
Ownership    The Project Manager is responsible for ensuring that all testing plans are created, and identifies them under one umbrella plan (i.e., the Master Project QA/Test Plan). The project testing team lead(s) develop the necessary subsequent test plans.
When
Phase: Design
Stage: Planning

The Project QA/Test Plan is completed during the Design phase of the Solution Delivery Life Cycle. It should be updated anytime additional information or project changes affect its content.

It is a required deliverable on Large and Medium projects, and a best practice on Fast Track projects. For additional guidance, the Project Classification Worksheet is available.
Template Completion

Note: Text within [ ] brackets needs to be replaced with project-specific information.

1. Do not include the Template Guidelines in your final document. Enter the project information in the page header and footer, title page, and document contributors and version control.
2. Complete the document utilizing suggested text where applicable and entering text/fields where shown within [blue text] brackets. Note that the blue text is NOT to be included in your final document. Its purpose is either to provide guidance for completing the document or to show where text/fields are to be input.
3. Once changes are made to your document and you are ready to finalize, ensure that you update your Table of Contents (TOC) section. To update the TOC: if you are unsure how to do this, place your mouse pointer to the left of the first entry in the Table of Contents section and click the left button once. Once the entire section is highlighted, move the pointer anywhere within the highlighted section and click the right button once. In the drop-down menu, choose Update Field, then Page Numbers Only or Entire Field as needed. Note that you might need to repeat these steps to change the font back to Tahoma 10 pt.
4. The Master Test Plan is to be retained with other project-related documentation and maintained in accordance with the business line's records retention policy.
Empowerment & Scalability    This template is provided as a guideline to follow in producing the minimum basic information needed to successfully complete a Project QA/Test Plan in meeting PMO guidelines, and illustrates the art and science of project management. Project Managers are empowered to use this template as needed to address any specific requirements of the proposed project at hand. The amount of detail included in the template will depend on the size and complexity of the project.
Important Notice    As this template may change, it is highly recommended that you access a blank template from the Program Management Office website (http://www.uservices.umn.edu/pmo/) each time you need one for a new project, and not merely reuse one from a previous project by changing the old text.
DOCUMENT INFORMATION AND APPROVALS

VERSION HISTORY

Version #    Date    Revised By       Reason for Change
1.0                  Aaron Demenge    PMO Review

DOCUMENT APPROVALS

Approver Name    Project Role    Signature/Electronic Approval    Date
TABLE OF CONTENTS

Introduction
    Scope
    Test Objectives
    Testing Goals
Test Methodology
    Entrance Criteria
    Exit Criteria
    Test Execution
    Test Scenarios
    Test Case/Script Development
    Defect Reporting
Test Environment
    Software Requirements
    Hardware Requirements
    Testing Platform
User Acceptance Test Plan
    Definition
    Testing Requirements
    Testers/Participants
    Testing Schedule
Assumptions and Risks
    Assumptions
    Risks
Go/No-go Meeting
Additional Project Documents
Roles and Responsibilities
Sign-off and Acknowledgement
Test Director – Defect Tracking Process
INTRODUCTION
Scope
The overall purpose of testing is to ensure the [name of application] application meets all of its technical, functional, and business requirements. The purpose of this document is to describe the overall test plan and strategy for testing the [name of application] application. The approach described in this document provides the framework for all testing related to this application. Individual test cases will be written for each version of the application that is released. This document will also be updated as required for each release.
Test Objectives

The quality objectives of testing the [name of application] application are to ensure complete validation of the business and software requirements:

- Verify software requirements are complete and accurate
- Perform detailed test planning
- Identify testing standards and procedures that will be used on the project
- Prepare and document test scenarios and test cases
- Perform regression testing to validate that unchanged functionality has not been affected by changes
- Manage the defect tracking process
- Provide test metrics/testing summary reports
- Ensure the application is certified for release into the University of Minnesota production environment
- Schedule the Go/No-go meeting
- Require sign-offs from all stakeholders
Testing Goals

The goals in testing this application include validating the quality, usability, reliability, and performance of the application. Testing will be performed from a black-box approach, not based on any knowledge of internal design or code. Tests will be designed around requirements and functionality.

Another goal is to make the tests repeatable for use in regression testing during the project lifecycle, and for future application upgrades. Part of the testing approach will be to initially perform a 'Smoke Test' upon delivery of the application for testing. Smoke testing is typically an initial testing effort to determine whether a new software version is performing well enough to accept it for a major testing effort. For example, if the new software is crashing frequently or corrupting databases, the software is not in a stable enough condition to warrant further testing in its current state. This testing will be performed first. After acceptance of the build delivered for system testing, functions will be tested based upon the designated priority (critical, high, medium, low).
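The smoke-test gate described above can be sketched in code. This is a minimal illustration, not the project's actual harness: the named checks (application starts, database reachable, login page loads) are hypothetical placeholders for real probes.

```python
# Minimal smoke-test sketch. The checks are hypothetical placeholders:
# replace them with real probes against the delivered build.

def run_smoke_tests(checks):
    """Run each named check; a raised exception counts as a failure.
    Returns (number_passed, names_of_failures)."""
    failures = []
    for name, check in checks:
        try:
            ok = bool(check())
        except Exception:
            ok = False
        if not ok:
            failures.append(name)
    return len(checks) - len(failures), failures

checks = [
    ("app_starts", lambda: True),          # placeholder probe
    ("database_reachable", lambda: True),  # placeholder probe
    ("login_page_loads", lambda: True),    # placeholder probe
]
passed, failures = run_smoke_tests(checks)
accept_build = not failures  # accept the build for full testing only if all pass
```

If any check fails, the build is rejected for the major testing effort rather than debugged further in place.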
Quality

Quality software is reasonably bug-free, meets requirements and/or expectations, and is maintainable. Testing the quality of the application will be a two-step process of independent verification and validation. First, a verification process will be undertaken, involving reviews and meetings to evaluate documents, plans, requirements, and specifications to ensure that the end result of the application is testable and that requirements are covered. The overall goal is to ensure that the requirements are clear, complete, detailed, cohesive, attainable, and testable. In addition, this helps to ensure that requirements are agreed to by all stakeholders.

Second, actual testing will be performed to ensure that the requirements are met. The standard by which the application meets quality expectations will be based upon the requirements test matrix, use cases, and test cases to ensure test case coverage of the requirements. This testing process will also help to ensure the utility of the application, i.e., the design's functionality: "does the application do what the users need?"

© 2012 Regents of the University of Minnesota. All rights reserved. Revised July 14, 2014.
Reliability

Reliability is both the consistency and repeatability of the application. A large part of testing an application involves validating its reliability in its functions, data, and system availability. To ensure reliability, the test approach will include positive and negative (break-it) functional tests. In addition, to ensure reliability throughout the iterative software development cycle, regression tests will be performed on all iterations of the application.
PROJECT QUALITY ASSURANCE

All project artifacts are posted to the EPM project SharePoint site, located at: [enter SharePoint site URL]

Fast Track Project Required Documents

Project Artifacts                               Complete?
Project Proposal in EPM
Initiation Phase Checklist
Project WBS
Project Charter
Business Requirements Document
Project Plan Review Checklist
Analysis Phase Checklist
Design Review Checklist
Conceptual IT Architecture Review Checklist
Application Architecture Design
System Architecture Design
Code Review Checklist
Implementation Plan Checklist
QA Test Plan
Test Planning Checklist
Deployment Readiness Assessment Checklist
User Acceptance Sign Off
Service Level Agreement and Checklist
Lessons Learned
Close Out Report
TEST METHODOLOGY

Entrance Criteria

- All business requirements are documented and approved by the business users.
- All design specifications have been reviewed and approved.
- Unit testing has been completed by the development team, including vendors.
- All hardware needed for the test environment is available.
- The application delivered to the test environment is of reliable quality.
- The initial smoke test of the delivered functionality is approved by the testing team.
- Code changes made to the test site will go through a change control process.

Exit Criteria

- All test scenarios have been completed successfully.
- All issues are prioritized, and priority 1 issues are resolved.
- All outstanding defects are documented in a test summary with a priority and severity status.
- A Go/No-go meeting is held to determine acceptability of the product.
Test Execution

The test execution phase is the process of running test cases against the software build to verify that the actual results meet the expected results. Defects discovered during the testing cycle shall be entered into the project SharePoint Team Site Defect list or Quality Center (offered by OIT). Once a defect is fixed by a developer, the fixed code shall be incorporated into the application and regression tested.

The following testing phases shall be completed (if applicable):

Unit Testing

Unit testing is performed by the report developers at U Services IT and OIT in their development environment. The developers know and will be testing the internal logical structure of each software component. A description of the unit testing should be provided to the project team.
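As an illustration of the kind of unit test a developer might provide, the sketch below exercises one small function whose internal logic the developer knows. The function under test (net_pay) and its rules are invented for the example; they are not part of this plan.

```python
# Unit-test sketch in Python's unittest style. The unit under test is a
# hypothetical helper; a real suite would target the project's own components.
import unittest

def net_pay(gross, deduction_rate):
    """Unit under test: apply a flat deduction rate to a gross amount."""
    if not 0 <= deduction_rate <= 1:
        raise ValueError("deduction_rate must be between 0 and 1")
    return round(gross * (1 - deduction_rate), 2)

class NetPayTest(unittest.TestCase):
    def test_typical_deduction(self):
        self.assertEqual(net_pay(1000.00, 0.25), 750.00)

    def test_zero_deduction(self):
        self.assertEqual(net_pay(1000.00, 0.0), 1000.00)

    def test_invalid_rate_rejected(self):
        with self.assertRaises(ValueError):
            net_pay(1000.00, 1.5)

# Run the suite programmatically; result.wasSuccessful() reports pass/fail.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(NetPayTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```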
Functional Testing

Functional testing focuses on the functional requirements of the software and is performed to confirm that the application operates accurately according to the documented specifications and requirements, and to ensure that interfaces to external systems are working properly.

Regression Testing

Regression testing shall be performed to verify that previously tested features and functions do not have any new defects introduced while other problems are corrected or other features are added or modified.
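A regression check can be as simple as comparing a build's outputs for existing features against a baseline recorded from the last good build. The sketch below is illustrative only; the feature names and the compute_outputs stand-in are hypothetical.

```python
# Regression sketch: expected results from the last good build are kept as a
# baseline, and each new build's outputs are compared against it so a newly
# introduced defect in unchanged functionality is flagged.

def compute_outputs(build):
    """Stand-in for re-running the existing feature suite against a build."""
    return {"login": "ok", "search": "ok", "report": build.get("report", "ok")}

baseline = {"login": "ok", "search": "ok", "report": "ok"}  # last good build

def regressions(build):
    """Return the names of features whose behavior changed from the baseline."""
    current = compute_outputs(build)
    return sorted(name for name, expected in baseline.items()
                  if current.get(name) != expected)

clean = regressions({})                    # unchanged build: no regressions
broken = regressions({"report": "error"})  # a fix elsewhere broke "report"
```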
Integration Testing
'ntegration testing is the phase of software testing in which individual software modules are
com%ined and tested as a group. 'n its simplest form, two units that have already %een
tested are com%ined into a component and the interface %etween them is tested. 'n a
realistic scenario, many units are com%ined into components, which are in turn aggregated
into even larger parts of the program. The idea is to test com%inations of pieces and
eventually e-pand the process to test your modules with those of other groups. :ventually
all the modules ma.ing up a process are tested together.
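The simplest case above, two already-tested units combined into a component, can be sketched as follows. The parser and validator are hypothetical stand-ins; the point is that the test exercises the interface between them, not either unit alone.

```python
# Integration sketch: two independently unit-tested pieces (a hypothetical
# record parser and a hypothetical validator) are combined into a component,
# and the check targets their interface.

def parse_record(line):
    """Unit 1 (already unit tested): turn 'name,amount' into a dict."""
    name, amount = line.split(",")
    return {"name": name.strip(), "amount": float(amount)}

def validate_record(record):
    """Unit 2 (already unit tested): accept only positive amounts."""
    return record["amount"] > 0

def load_valid_records(lines):
    """Component under integration test: parser output feeds the validator,
    so the two units must agree on the record shape."""
    records = [parse_record(line) for line in lines]
    return [r for r in records if validate_record(r)]

rows = ["alice, 120.50", "bob, -3.00", "carol, 88.00"]
valid = load_valid_records(rows)
names = [r["name"] for r in valid]
```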
Interface Testing

This testing follows a transaction through all of the product processes that interact with it and tests the product in its entirety. Interface testing shall be performed to ensure that the product actually works in the way a typical user would interact with it.

Destructive Testing

Destructive testing focuses on the error detection and error prevention areas of the product. This testing is exercised in an attempt to anticipate conditions where a user may encounter errors. Destructive testing is less structured than other testing phases and is determined by individual testers.

User Acceptance Testing

User acceptance testing activities will be performed by the business users. The purpose of this testing is to ensure the application meets the users' expectations. It also focuses on usability and will include: appearance, consistency of controls, consistency of field naming, accuracy of drop-down field information lists, spelling of all field names/data values, accuracy of default field values, tab sequence, and error/help messaging.
Test Scenarios

Below are the high-level scenarios that will be tested. These scenarios are derived from the Requirements Matrix and Use Cases. From these, detailed test scripts will be created.

ID Number    Test Scenario Description    Test Script Reference    Testing Complete?

ALL STATED REQUIREMENTS EXIST AND FUNCTION
001    [A test scenario is almost like a story: "a user enters the application from the login window by entering a valid user name and password; after logging in, he clicks on the Payslip module and clicks on the latest-payslip feature to view his latest payslip". Any test scenario will contain a specific goal.]
002
SECURITY
003    [Add description of requirements.]
004
DATA VALIDATION
005    [Add description of requirements.]
006
ENVIRONMENT
007    [Add description of requirements.]
008
INTERFACES
009    [Add description of requirements.]
010
Test Script Development

Test script design is the central focus of a software quality assurance process. A test script is defined as a written specification describing how a single business or system requirement, or a group of them, will be tested. The test script consists of a set of actions to be performed, data to be used, and the expected results of the test. The actual results of the test are recorded during test execution. Test scripts will also be updated as testing proceeds.

Test scripts written for this project include the following:

- Test Script ID
- Test cases verified
- Requirements verified
- Purpose of test
- Any dependencies and/or special set-up instructions required for performing the test
- Test description and steps
- Expected results
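The fields listed above could be captured as a simple record type so scripts are collected and reported on consistently. This is a sketch only: the field names are taken from the list, but the schema itself is illustrative and not mandated by this plan.

```python
from dataclasses import dataclass, field

# Sketch of a test-script record matching the fields listed in this section.
@dataclass
class TestScript:
    script_id: str                                     # Test Script ID
    test_cases: list = field(default_factory=list)     # test cases verified
    requirements: list = field(default_factory=list)   # requirements verified
    purpose: str = ""                                  # purpose of test
    setup: str = ""                                    # dependencies / special set-up
    steps: list = field(default_factory=list)          # test description and steps
    expected_results: str = ""
    actual_results: str = ""                           # recorded during execution

script = TestScript(
    script_id="TS-001",
    test_cases=["TC-101"],
    requirements=["REQ-7"],
    purpose="Verify login with a valid user name and password",
    steps=["Open login window", "Enter valid credentials", "Submit"],
    expected_results="User reaches the home screen",
)
script.actual_results = "User reached the home screen"  # filled in at execution
```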
Defect Reporting

Issues/defects are tracked to resolution with the following guidelines:

- Issues will be reported based upon documented requirements.
- Issues will be tracked by the testing team, reported, and entered into Quality Center.
- Issues will be fixed by the development team based on the priority/severity assigned by the test lead.
- All critical/priority 1 defects will be fixed before release to production.

See the Defect Tracking Process at the end of this document for detailed instructions on how to log and track defects in Quality Center.
TEST ENVIRONMENT

Requirements

Client/Server Technical Requirements:

- Mixed browsers supported (Internet Explorer, Firefox, Mozilla)
- Oracle Database
- Client Platform: PC and Macintosh
- Production server location:

Testing Platform

- Desktop PC: the application supports all A-Grade browsers for Windows and Mac operating systems, as defined by Yahoo!'s Graded Browser Support standards (http://developer.yahoo.com/yui/articles/gbs/). Windows 2000/IE6 may be excluded.
- Test server location:
USER ACCEPTANCE TEST PLAN

Definition

The overall purpose of testing is to ensure the [name of application] application performs at an acceptable level for the customer. This section outlines the detailed plan for user acceptance testing of this application.

This test plan will be used to record the customer's sign-off of the documented scenarios. Detailed test scripts/cases have been developed and will be used to record the results of user testing. This document is a high-level guide and is not intended as a replacement for any specific user acceptance testing procedures that individual areas might have.
Testing Requirements

- Testing will take place in [insert location]. Some testers may choose to perform some testing from their regular workstations where possible. Test results must still be coordinated with others.
- UAT will take place beginning on [insert date].
- Identified testing participants will receive instructions prior to the start of testing.
- Identified testing participants will perform the equivalent of their normal business function in the upgraded environment.
- Test scripts/cases and scenarios will be prepared prior to the start of UAT.
- Test participants will conduct the tests and document results.
- Defects will be entered into Test Director and tracked by the Test Lead.
Testers/Participants

Testing participants should include representatives from all areas involved in the application. There are benefits to including representatives from across all areas to validate the system's functions before the upgrade goes live in production.

The best candidates for UAT are:

- Staff directly impacted by the upcoming system and business process changes.
- Frequent users of the application and functions planned in test scripts/cases.
- Individuals with a sound understanding of business processes in the areas they represent.
- Individuals with the necessary time to commit to this endeavor.
- Individuals willing to experiment (to try various methods to see what works and what doesn't).
- Individuals who are patient and have a tolerance for ambiguity.

Tester Name    Department/Area Representing    Area of Testing Focus
Testing Schedule

All upgraded functionality and test data will be migrated to the test environment prior to the start of user acceptance testing.

Activity                                                Lead Responsibility    Date
Identify and select testers for UAT
Develop test scenarios and scripts/cases
Validate participants' availability for testing
Review scenarios/scripts for accuracy, completeness,
and sequence (confirm test data is correct)
Ensure UAT lab desktops are configured for testing
UAT environment validation
Testing by UAT participants
ASSUMPTIONS AND RISKS

Assumptions

- The business team has reviewed and accepted the functionality identified in the business requirements and software requirements documents.
- A project change control process is in place to manage requirements.
- Code walkthroughs/reviews will be completed by the development team.
- Unit testing will be completed by the development team prior to release to the test team.
- Testers will test what is documented in the requirements.
- The test team will have a separate test environment in which to perform testing.
- All changes to requirements will be communicated to the test team.
- Resources identified in this plan are available to test the application, resolve defects, and address issues as they are raised by the test team.
- The delivery of the product to production contains all setup, etc., necessary for optimum performance in the production site.
- Project sponsors, business and technical, will provide actionable guidance on defect prioritization and resolution.
- The UAT environment will be available, and desktops will be available to perform testing.
Risks

- Scope creep (last-minute addition of new requirements) impacts deadlines for the development team and test team.
- An aggressive target date increases the risk of defects being migrated to production. If development timelines are not met, this will directly impact the testing timelines.
- Key resources have competing priorities, making their availability less than scheduled.
- Any downtime of the test system will significantly impact the testing cycle.
- Load testing is not being completed on a consistent basis; the true performance of the application may not be known until release to production.
GO/NO-GO MEETING

Once the test team has completed the test cycle, a Go/No-go meeting is scheduled as part of the implementation planning under launch readiness. This meeting is attended by the project manager, business team, test lead, technical lead, and any other stakeholders.

The test lead will provide a testing summary and list all outstanding unresolved defects and any risks associated with releasing the product to production. All outstanding issues are discussed at that time before a decision is made to push to production. A written sign-off form is signed by all team members as listed above. The list of outstanding issues is also attached to the sign-off form.
ROLES AND RESPONSIBILITIES

Resource Type    Responsibilities    Name

Sponsor:
- Provides Go/No-go authorization that the product is ready for release as part of implementation planning and the launch process
- Prioritizes issues and defects, and manages technical resources
- Makes decisions on unresolved issues

Project Manager:
- Provides guidance on the overall project
- Coordinates and develops the project schedule
- Acts as liaison with the business to ensure participation and ownership
- Tracks all project activities and resources, ensuring the project remains within scope
- Facilitates identifying and bringing closure to open issues
- Communicates project status

Subject Matter Experts:
- Define business requirements and expected results for business acceptance
- Execute user acceptance testing

Dev Team Lead:
- Designs application architecture
- Creates technical design

Database Administrator:

Developers:
- Write application code
- Resolve defects
- Support testers

Business Lead:
- Writes business requirements, test plan, and test cases
- Maintains requirements and defect reporting in Test Director
- Leads testing cycle

QA Lead:
- Maintains project in Test Director
- Writes test plan to include test scenarios and cases
- Facilitates testing
- Maintains and manages defects in Test Director

Business Analyst:
- Writes business requirements and builds test scripts
- Maintains requirements in Test Director
- Leads testing cycle and coordinates test environment

Testers:
- Perform user acceptance testing
SIGN-OFF AND ACKNOWLEDGEMENT

I understand that by agreeing to participate in this testing through the execution of the testing plan, I approve of the activities defined and authorize my department to participate as documented for the successful implementation of this application in our department.

_______________________________    Date: ___/___/___
Resource Name
Title or Responsibility

_______________________________    Date: ___/___/___
Resource Name
Title or Responsibility

_______________________________    Date: ___/___/___
Resource Name
Title or Responsibility

_______________________________    Date: ___/___/___
Resource Name
Title or Responsibility

_______________________________    Date: ___/___/___
Resource Name
Title or Responsibility

_______________________________    Date: ___/___/___
Resource Name
Title or Responsibility

_______________________________    Date: ___/___/___
Resource Name
Title or Responsibility

_______________________________    Date: ___/___/___
Resource Name
Title or Responsibility
TEST DIRECTOR – DEFECT TRACKING PROCESS

Summary: Screen name and a short description of the defect being reported, usually providing key words with which to identify and/or search for the defect.

Detected By: Auto-populates with the User ID of the person logged in.

Detected on Date: Auto-populates with the current date.

Severity: Describes the degree of impact that a defect has on the operation of the application.

Assigned To: Individual being assigned the defect for fixing.

Detected in Build: Build ID in which the defect was found. The Build ID is an identifier for the code release, assigned by Web Development.

Fixed in Build: Build ID in which the defect is fixed. The Build ID is an identifier for the code release, assigned by Web Development.

Priority: Describes the impact the defect has on the work in progress and the importance and order in which a bug should be fixed.

Status: Indicates the existing state of a defect; auto-populates with a default of "New".

Description:
- Enter a description of the defect.
- Add the individual steps to reproduce it; include all steps and screens that were accessed.
- Enter the exact words of the error message.

Email Defect: After entering the defect, right-click on it and select Email to send it to the assigned developer.

Defect resolution process: When the defect is opened, it is assigned to the appropriate person and the status is changed to "Assigned".

Once the defect is fixed:

1. The developer to whom the defect is assigned will update the defect comments to document the fix that was made. The User ID and date are automatically added to the defect by clicking on "Add Comment".
2. The developer to whom the defect is assigned will change the status to "Fixed", and will change the "Assigned To" field to the tester or defect manager.
3. The tester will retest the submitted defect.
4. If the defect passes the retest, the tester or defect manager will change the status to "Closed".
5. If the defect is not fixed, the tester will change the status to "Assigned" and enter the User ID of the developer in the "Assigned To" field.
6. Once the defect has been verified as fixed, the project manager (or defect manager) will update the status to "Closed".
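The status flow in the numbered steps above (New to Assigned to Fixed to Closed, with a failed retest looping back to Assigned) can be summarized as a small transition table. This is an illustration of the process, not a Test Director feature.

```python
# Defect-status transitions from the resolution process above. The table is
# illustrative; status names follow the numbered steps.

ALLOWED = {
    "New":      {"Assigned"},
    "Assigned": {"Fixed"},
    "Fixed":    {"Closed", "Assigned"},  # retest passes -> Closed; fails -> Assigned
    "Closed":   set(),
}

def advance(status, new_status):
    """Apply one transition, rejecting moves the process does not allow."""
    if new_status not in ALLOWED[status]:
        raise ValueError(f"cannot move defect from {status} to {new_status}")
    return new_status

# Happy path: entered, assigned, fixed, retested successfully, closed.
s = "New"
for step in ["Assigned", "Fixed", "Closed"]:
    s = advance(s, step)
```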
DEFINITIONS FOR DEFECT PRIORITY AND SEVERITY

PRIORITY: This field describes the impact the defect has on the work in progress and the importance and order in which a bug should be fixed. This field is utilized by the developers and test engineers to prioritize work effort on the defect resolution.

1 – Urgent/Blocks Work: Further development and/or testing cannot occur until the defect has been resolved.
2 – Resolve ASAP: The defect must be resolved as soon as possible because it is impairing development and/or testing activities.
3 – Normal Queue: The defect should be resolved in the normal prioritization and completion of defect resolution.
4 – Low Priority: The defect is an annoyance and should be resolved, but it can wait until after more serious defects have been fixed.
5 – Trivial: The defect has little or no impact on development and/or testing work.

SEVERITY: This field describes the degree of impact that a defect has on the operation of the application.

1 – Critical: Critical loss of function. The defect results in system crashes, the failure of a key subsystem or module, a corruption or loss of data, or a severe memory leak.
2 – Major: Major loss of function. The defect results in a failure of the system, subsystem, or module, but does not result in the corruption or loss of significant data.
3 – Moderate: Moderate loss of function. The defect does not result in a failure of the system, subsystem, or module, but may cause the system to display data incorrectly, incompletely, or inconsistently.
4 – Minor: Minor loss of function, or another problem where a workaround is present. There are no data integrity issues.
5 – Usability: The defect is related to system usability, is the result of non-conformance to a standard, or is related to the aesthetics of the system. There is no loss of system function.
6 – Enhancement: The defect is a request for an enhancement, i.e., it is not within the scope of the current project effort.
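For teams tracking defects in code or spreadsheets rather than Test Director, the two scales above map naturally onto integer enumerations (1 = most urgent or most severe). The names below are shortened from the definitions; the numeric values follow the tables.

```python
from enum import IntEnum

# Priority and severity scales from the definitions above, as integer enums
# so defect records can be sorted and filtered numerically.

class Priority(IntEnum):
    URGENT_BLOCKS_WORK = 1
    RESOLVE_ASAP = 2
    NORMAL_QUEUE = 3
    LOW_PRIORITY = 4
    TRIVIAL = 5

class Severity(IntEnum):
    CRITICAL = 1
    MAJOR = 2
    MODERATE = 3
    MINOR = 4
    USABILITY = 5
    ENHANCEMENT = 6

# Example: a fix-first ordering puts lower priority numbers first.
defects = [(Priority.TRIVIAL, "typo"), (Priority.URGENT_BLOCKS_WORK, "login broken")]
defects.sort(key=lambda d: d[0])
```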