
Test Design (page 1 of 15)

Brought to you by Software Planner (www.softwareplanner.com). Deliver software solutions to specification, on time and on budget, with Software Planner. It tracks customer requirements, test cases, defects, project deliverables, and your schedule (appointments and to-do lists) via the web. Discussion forums and contact management are also built in. Free 2-week trial.

Software Planner: Guided Tour | Pricing | Brochure | Free Trial | Software Planner Details

Test Design

Owners and List of Contacts


Name             | Email       | Phone                | Role
John Doe         | jdoe@me.com | 303-894-7315 (w)     | Project Manager / Development Lead
                 |             | 303-471-8344 (h)     |
                 |             | 303-203-5567 (pgr)   |
Joe Tester       |             |                      | System Test Lead
Jane ProdSupport |             |                      | Production Support Mgr
Joe UserMgr      |             |                      | User Test Lead
Joe Developer    |             |                      | Developer – Presentation Tier
Jane Developer   |             |                      | Developer – Business Tier
Joe DBA          |             |                      | Database Administrator
Joe Tester       |             |                      | Tester
Jane Tester      |             |                      | Tester
Joe Customer     |             |                      | Department VP
Jane Customer    |             |                      | Department Mgr
Josey Customer   |             |                      | Product Support

Signoffs

Phase       | Name                                     | Date     | Signature
Test Design | John Doe, PM/DM                          | xx/xx/xx |
            | Joe Tester, System Test Lead             |          |
            | Jane ProdSupport, Production Support Mgr |          |
            | Joe UserMgr, UM                          |          |
            | Joe Customer, Customer                   |          |

Revision History

_____________________________________________________________________________
Developed by Pragmatic Software Co., Inc. http://www.pragmaticsw.com

Date       | Reason for Change(s) | Author(s)
09/15/1988 | First draft          | John Doe


Table of Contents

Test Design
   Owners and List of Contacts
   Signoffs
   Revision History
1. Introduction
   1.1 Background
   1.2 References
   1.3 Code Freeze Date
   1.4 Change Control
2. Items to Be Tested
   2.1 Level of Testing
   2.2 Features to Be Tested
   2.3 Test Case Matrix
   2.4 Features Excluded from Testing
3. Testing Approach
   3.1 Test Deliverables
   3.2 Defect Tracker Setup
4. Release Criteria
   4.1 Test Case Pass/Fail Criteria
   4.2 Suspension Criteria for Failed Smoke Test
   4.3 Resumption Requirements
   4.4 Release to User Acceptance Test Criteria
   4.5 Release to Production Criteria
5. Hardware
   5.1 Servers
   5.2 Server Configuration
   5.3 Other Servers
   5.4 Clients
6. Project Plan
   6.1 Critical Dates
   6.2 Budget Information


1. Introduction

The primary goal of this document is to establish a plan for the activities that will verify [product name] as a high-quality product that meets the needs of the [product name] business community. These activities focus on identifying the following:

• Items to be tested
• Testing approach
• Roles and responsibilities
• Release criteria
• Hardware
• Project plan and budget
• Risks and contingencies

1.1 Background

Write one or two sentences that describe the system to be tested.

Example:
The Defect Tracker system is a sophisticated bug-tracking tool that allows clients to significantly increase the quality of their software deliverables. This is a new product, so no backwards compatibility is necessary.

1.2 References

List all reference material you used in creating this plan.

Example:
1. Functional Specification, Program Management, xxx 1999
2. Testing Computer Software, Second Edition, Kaner / Falk / Nguyen, 1993
3. Detailed Design, Program Management, xxx 1999

1.3 Code Freeze Date

Production code for [product name] will be frozen on mm/dd/yy. Our assumption is that any production code changes made after that date are outside the responsibility of this development project.

1.4 Change Control

After baseline, all changes must be approved and documented by the change control board. If it is agreed that a change is necessary, its impact on development and testing must be agreed upon by the test lead, development lead, and project manager. This may (or may not) affect the planned completion date of the project.

2. Items to Be Tested

2.1 Level of Testing

Below is a list of services that testing may provide. Next to each service is the degree of testing that we will perform. The valid levels are:


High – high-risk area; test this area very hard
Medium – standard testing
Low – low-risk area; test if time allows
None – no testing desired

Level Desired    Service

_____    Performance Testing:
         Performance testing ensures that the application responds within the
         time limits set by the user. If this is needed, the client must supply
         the benchmarks to measure against, and we must have a hardware
         environment that mirrors production.

_____    Windows / Internet GUI Standards:
         This testing ensures that the application has a standardized look and
         feel. It may be as simple as ensuring that accelerator keys work
         properly and that font type and size are consistent, or as exhaustive
         as ensuring that the application could be assigned a Windows logo if
         submitted for one (there are strict guidelines for this). Note: if
         this level of testing is needed, the client must provide their
         standards so that we can compare against them. A good book that
         explains the Microsoft standards is The Interface Guidelines for
         Software Design from Microsoft Press.

_____    Platform Testing:
         Platform testing warrants that the application will run on multiple
         platforms (Win 95/98, Win NT, IE 4.0, IE 5.0, Netscape, etc.).
         (Specify which ones.)

_____    Localization:
         Localization testing guarantees that the application will work
         properly in different languages (e.g., Win 95/98 English, German,
         Spanish, etc.). This also involves ensuring that dates work in
         dd/mm/yy format for the UK. (Specify which ones.)

_____    Stress Testing:
         Stress testing ensures that the application responds appropriately
         with many users and activities happening simultaneously. If this is
         needed, the number of users must be agreed upon beforehand, and the
         hardware environment for system test must mirror production.

_____    Conversion:
         Conversion testing exercises any data that must be converted, to
         ensure that the application will work properly. This could be
         conversion from a legacy system or changes needed for the new schema.

_____    Parallel Testing:
         Parallel testing compares the functionality of the updated system with
         the functionality of the existing system. This is sometimes used to
         ensure that the changes did not corrupt existing functionality.

_____    Regression of Unchanged Functionality:
         If regression must occur for functional areas that are not being
         changed, specify the functional areas to regress and the level of
         regression needed (positive only, or positive and negative testing).

_____    Automated Testing:
         Automated testing can be used to automate regression and functional
         testing. This can be very helpful if the system is stable and not
         changed often. If the application is a new development project,
         automated testing generally does not pay big dividends.

_____    Installation Testing:
         Installation testing exercises the setup routine to ensure that the
         product can be installed fresh, over an existing copy, and with other
         products. This will test different versions of OCXs and DLLs.

_____    End-to-End / Interface Testing:
         End-to-end testing exercises all inputs (super-systems) and outputs
         (sub-systems) along with the application. A controlled set of
         transactions is used, and the test data is published prior to the test
         along with the expected results. This testing ensures that the
         application will interact properly with the other systems.

_____    Usability:
         Usability testing ensures that the application is easy to work with,
         limits keystrokes, and is easy to understand. The best way to perform
         this testing is to bring in experienced, intermediate, and novice
         users and solicit their input on the usability of the application.

_____    User's Guide / Training Guides:
         This testing ensures that the user, help, and training guides are
         accurate and easy to use.

_____    Guerilla Testing:
         Guerilla testing exercises the system with unstructured scenarios to
         ensure that it responds appropriately. To accomplish this, you may ask
         someone to perform a function without telling them the steps for
         doing it.

_____    Security Testing:
         Security testing guarantees that only users with the appropriate
         authority are able to use the applicable features of the system.

_____    Network Testing:
         Network testing determines what happens when different network latency
         is applied while using the application. It can uncover possible
         problems with slow network links, etc.

_____    Hardware Testing:
         Hardware testing involves testing with a bad disk drive, faulty
         network cards, etc. If this type of testing is desired, be very
         specific about what is in scope.

_____    Duplicate Instances of Application:
         This testing determines whether bringing up multiple copies of the
         same application will cause blocking or other problems.

_____    Year 2000:
         This testing ensures that the application will work in the year 2000
         and beyond.

_____    Temporal Testing:
         Temporal testing guarantees that date-centric problems do not occur in
         the application. For example, if many bills are created quarterly, you
         may want to set the server date to a quarter-end to test this
         date-centric event.

_____    Disaster Recovery (Backup / Restore):
         This testing aids production support in ensuring that adequate
         procedures are in place for restoring the system after a disaster.

_____    Input and Boundary Tests:
         Testing designed to guarantee that the system allows only valid input.
         This includes ensuring that the maximum number of characters for a
         field cannot be exceeded, and checking boundary conditions such as
         valid ranges, "off-by-one", "null", "max", "min", and tab order from
         field to field on the screen.

_____    Out of Memory Tests:
         Testing designed to ensure that the application runs in the amount of
         memory specified in the technical documentation. This testing will
         also detect memory leaks associated with starting and stopping the
         application many times.
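The performance-testing entry above hinges on a client-supplied benchmark. As a minimal sketch of such a check in Python, assuming a hypothetical 2-second benchmark and a placeholder fetch_page() operation (neither comes from this plan):

```python
import time

# Hypothetical benchmark: the client requires the page to render in
# under 2.0 seconds. The threshold and fetch_page() are illustrative.
BENCHMARK_SECONDS = 2.0

def fetch_page():
    # Placeholder for the real operation under test (e.g., an HTTP GET
    # against the system test server). Here it just simulates work.
    time.sleep(0.1)
    return "ok"

def measure(operation, repetitions=5):
    """Return the worst-case elapsed time over several runs."""
    worst = 0.0
    for _ in range(repetitions):
        start = time.perf_counter()
        operation()
        worst = max(worst, time.perf_counter() - start)
    return worst

elapsed = measure(fetch_page)
print(f"worst case: {elapsed:.2f}s, passes: {elapsed <= BENCHMARK_SECONDS}")
```

Measuring the worst case rather than the average keeps the check conservative, which matches the "mirrors production" caveat in the table.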

2.2 Features to Be Tested

Below is a list of features that will be tested. Note: if features are being dropped into production at different times, add a couple of columns to this grid that identify which features will be in each drop. The items below map back to the functional specifications.

Business Requirements Page | Ref. No. | Feature | Functional Specification
                           | 5.2.1    | Link to Inquiry page | Access to this page will come from a link off the resource menu on the select page
                           | 5.2.1    | Page initialization and defaults | User impersonation will not be recognized
                           | 5.2.2.1  | User name | Use the logon ID to retrieve the user's first and last name
                           | 5.2.2.2  | User email alias | Use the logon ID to retrieve the user's email address
                           | 5.2.2.3  | User company | Use the logon ID to retrieve the user's company name
                           | 5.2.2.4  | Country dropdown | This combo box will be populated from the country table
                           | 5.2.2.5  | Question type dropdown | This combo box will be populated from a hard-coded list
                           | 5.2.2.6  | Question sub-type dropdown | This combo box will be populated from a hard-coded list
                           | 5.2.3.1  | Reference fields | The reference fields will default to blank
                           | 5.2.3.2  | Subject field | The subject field will default to blank
                           | 5.2.3.3  | All combo boxes | All combo box or drop-down controls will be set to unselected


2.3 Test Case Matrix

Test Case ID | Func Spec ID | Test Case Description | Expected Result

Group: Navigation / Security
101 | 400 – group 5.2.1 | Display inquiry page from resource menu on the select page | Inquiry page displayed
102 | 401 – group 5.2.1 | Attempt to display the inquiry page without permission | Inquiry page not displayed because unable to display resource menu

Group: User Interface
103 | none | Tab through inquiry page | Tab from top to bottom and left to right
104 | none | Try all browser functions | All functions operate in a standard way
105 | none | Check all labels for spelling, alignment | All labels displayed according to screen example
    | 402 – group 5.2.3.3 | Check selectability of combo boxes (CB) and drop-down controls | All CB and drop-down controls are set to unselected

Group: Default Values
106 | 409 – group 5.2.2.1 | Valid user name associated with logon ID | User name displayed (see user table)
107 | 409 – group 5.2.2.2 | Valid email address associated with logon ID | User email address displayed (see user table)
108 | 409 – group 5.7.2.1 | No user email address associated with logon ID | Display error message "You must have an email address before the request can be processed."
109 | 409 – group 5.2.2.3 | Valid company name associated with logon ID | Company name displayed (company table using a join from the user)
110 | 409 – group 5.2.4   | No company name associated with logon ID | Display error message "Your company is not registered on [product name] and use of the inquiry form is not allowed."

Group: Boundary Tests
111 | none | Display a valid user ID (minimum char. and maximum char.) associated with a logon ID | User name displayed
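Boundary cases like test case 111 lend themselves to data-driven checks. Below is a sketch assuming a hypothetical validator that accepts user IDs of 1 to 20 characters; the real limits would come from the functional specification:

```python
# Hypothetical field limits -- the real values come from the functional spec.
MIN_LEN, MAX_LEN = 1, 20

def is_valid_user_id(user_id):
    """Accept only non-empty strings within the allowed length range."""
    return isinstance(user_id, str) and MIN_LEN <= len(user_id) <= MAX_LEN

# Boundary cases: minimum, maximum, off-by-one on each side, and null.
cases = [
    ("a",                 True),   # minimum length
    ("x" * MAX_LEN,       True),   # maximum length
    ("",                  False),  # one below minimum
    ("x" * (MAX_LEN + 1), False),  # one above maximum
    (None,                False),  # null input
]

for value, expected in cases:
    assert is_valid_user_id(value) == expected, value
print("all boundary cases pass")
```

Keeping the cases in a table like this makes it cheap to add the "off-by-one", "null", "max", and "min" conditions called out in section 2.1.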

2.4 Features Excluded from Testing

Below is a list of features that will not be tested.

Description of Excluded Item | Reason Excluded
Unit testing or "white box" testing | Development is responsible for this. We test it using a "black box" approach (i.e., we cannot see the code, but we expect the code to work per the specifications).

3. Testing Approach

The system test team will begin designing their detailed test plans and test cases while the development team is designing and coding. Defect Tracker will be used to enter the test cases and to track the defects. It can be accessed from http://www.defecttracker.com.

The builds will be delivered to system test via Visual SourceSafe (see cover page for VSS location), with drops coordinated by the development team. The development team will be responsible for installing the partial new builds into the existing structure of the system test environment and for updating the client machines if necessary. Build notes with all changes since the last drop and a list of all files to be delivered will accompany each build drop.

Once the build is dropped by the development team, a series of scripts, called the smoke test, will be run to ensure that the shipment from development is in a state that is ready for testing. The smoke test scripts will test the basic functionality of the system. These scripts may be automated once they are successfully performed manually. If an excessive number of smoke test items fail, the product will be shipped back to development and no testing will begin until the smoke test passes.
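A smoke test of this kind can start life as a short script of basic checks that must all pass before deeper testing begins. The sketch below uses hypothetical check functions; the real checks (login, opening the inquiry page, submitting a trivial inquiry, and so on) are not specified in this plan:

```python
# Each smoke check returns True on success. These are placeholders for
# the real basic-functionality checks -- the names are illustrative.
def check_login():
    return True

def check_inquiry_page_loads():
    return True

def check_submit_inquiry():
    return True

SMOKE_CHECKS = [check_login, check_inquiry_page_loads, check_submit_inquiry]

def run_smoke_test(checks=SMOKE_CHECKS):
    """Run every check; return the list of failures (empty means pass)."""
    return [c.__name__ for c in checks if not c()]

failures = run_smoke_test()
if failures:
    print("smoke test FAILED; return build to development:", failures)
else:
    print("smoke test passed; begin system testing")
```

Returning the list of failing check names, rather than stopping at the first failure, gives development the full picture when a build is sent back.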

Once the first drop begins, triage meetings will be held to discuss the bug list with the project/development manager. Triage meetings are used to prioritize bugs, set their priority and severity, and assign them. Each week following the first drop, additional drops will be delivered to system test to verify the bugs fixed in the prior drops.

Defect Tracker will be used to track, report, and analyze bugs. Prior to triage, a Defect Tracker protocol document will be distributed to the project and development managers to ensure that everyone understands how to use Defect Tracker and how to effectively enter bugs.

3.1 Test Deliverables

Below are the deliverables for each phase of the project.

Pre-baseline – Project initiation (Responsible: Test Lead)
  Upon receipt of a functional specification, project initiation will be performed. This includes finding a test lead for the project and setting up a project in Defect Tracker.

Pre-baseline – Kick-off meeting (Responsible: Test Lead)
  This is done to familiarize the project manager with the test methodology and test deliverables, set expectations, and identify next steps.

Pre-baseline – Functional requirement scrubbing (Responsible: Test Lead)
  Attend meetings to create functional specifications. Offer suggestions if anything is not testable or poorly designed.


Pre-baseline – Create pre-baseline documents (Responsible: Test Lead)

  Test plan. The test plan will break functionality into logical areas (most often specified in the functional specification). Once completed, the project manager, development manager, user project manager, and production support manager will review it. Once reviewed and amended, it must be approved and signed by the test lead, project manager, development manager, production support manager, and user project manager.

  Create testing project plan and budget. Before completing the project plan, the development project plan must be completed, so that we know what dates we are being asked to adhere to. This also guides us in determining whether the test estimates are reasonable. The project plan will be detailed, relating back to the functional specification and test plan.

Post-baseline – Create test cases (Responsible: Tester, guided by the Test Lead)
  Once the detailed test plan has been created and reviewed by the test and development teams, test cases are created. Test cases are stored in Defect Tracker. Each test case includes the steps necessary to perform the test and the expected results, and contains (or refers to) any data needed to perform the test and to verify that it works.

Post-baseline – Project and test plan traceability (Responsible: Test Lead)
  Review the test plan to ensure all points of the functional specification are accounted for. Likewise, ensure that the test cases have traceability with the test plan and functional spec. Finally, ensure that the project plan has traceability with the test plan.

Once testing begins – Triage (Responsible: Test Lead)
  Once testing begins, triage meetings will be held to prioritize and assign bugs. This is conducted by the test lead and will include the project manager and development lead. Once user testing begins, the user project manager will also attend. Triage meetings are usually held 2 to 5 times per week, depending on the need.

Bi-weekly – Update project plan and budgeting (Responsible: Test Lead)
  Update the project plan with % complete for each task and enter notes regarding critical issues that arise. Also determine whether the test effort is on budget.


Weekly – Weekly status report (Responsible: Test Lead)
  The project manager will specify who is to receive the weekly status report. This report will identify the % complete of all tasks that should be due by that week, the tasks to be worked on in the next week, metrics indicating the testing statistics (once testing begins), budgeting information, and any issues or risks that need to be addressed. This information can be generated from Defect Tracker.

Before sending to UAT – Release to UAT report (Responsible: Test Lead)
  Once system testing is done and the code is ready for user acceptance testing (UAT), the test lead will create a report that summarizes the activities and outlines any areas of risk. It will be created from a template (accessible from our web site), and all assumptions will be listed.

Project completion – Post-mortem analysis (Responsible: Test Lead, Development Team, User Team)
  This is done to analyze how well our testing process worked.
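The traceability review described above can be partly mechanized: collect the functional-spec points covered by the test cases and flag any spec point with no coverage. A sketch with made-up IDs (the real ones would be exported from the functional spec and from Defect Tracker):

```python
# Illustrative data only -- real IDs would come from the functional spec
# and from the test cases stored in Defect Tracker.
spec_ids = {"5.2.1", "5.2.2.1", "5.2.2.2", "5.2.2.3", "5.2.3.3"}
test_case_coverage = {
    101: "5.2.1",
    106: "5.2.2.1",
    107: "5.2.2.2",
    109: "5.2.2.3",
}

def uncovered(spec_ids, coverage):
    """Return spec points that no test case traces back to."""
    return sorted(spec_ids - set(coverage.values()))

print("spec points with no test case:", uncovered(spec_ids, test_case_coverage))
# -> spec points with no test case: ['5.2.3.3']
```

The same set difference, run in the other direction, flags test cases that reference spec points no longer in the functional specification.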

3.2 Defect Tracker Setup

The test lead will create a project in Defect Tracker so that bugs can be tracked. The project name in Defect Tracker will be [projectname].

4. Release Criteria

4.1 Test Case Pass/Fail Criteria

A feature will pass or fail depending upon the results of the testing actions. If the actual output from an action is equal to the expected output specified by the test case, the action passes. Should any action within a test case fail, the entire feature or sub-feature fails. The specific criteria for test case failure will be documented in Defect Tracker.

If a test case fails, it is not assumed that the code is defective. A failure can only be interpreted as a difference between the expected results, which are derived from project documentation, and the actual results. There is always the possibility that the expected results are in error because of misinterpreted, incomplete, or inaccurate project documentation.

Pass criteria:
• All processes will execute with no unexpected errors.
• All processes will finish update/execution in an acceptable amount of time, based on benchmarks provided by the business analysts and documented by the development team.


4.2 Suspension Criteria for Failed Smoke Test

The system test team may suspend partial or full testing activities on a given build if any of the following occurs:

• Files are missing from the new build.
• The development team cannot install the new build or a component.
• The development team cannot configure the build or a component.
• There is a fault with a feature that prevents its testing.
• An item does not contain the specified change(s).
• An excessive number of bugs that should have been caught during the component/unit test phase are found during more advanced phases of testing.
• A severe problem has occurred that does not allow testing to continue.
• Development has not corrected the problem(s) that previously suspended testing.
• A new version of the software is available to test.

4.3 Resumption Requirements

The steps necessary to resume testing:

• Clean previous code from the machines.
• Re-install the item.
• Correct the problem that resulted in the suspension.

Resumption of testing will begin when the following are delivered to the system test team:

• A new build via Visual SourceSafe.
• A list of all bugs fixed in the new version.
• A list of all the changes to the modules in the new version and what functionality they affect.

4.4 Release to User Acceptance Test Criteria

The release criteria necessary to allow the code to migrate to user acceptance testing are as follows:

• There are no open bugs with a severity of 1 or 2.
• Test cases scheduled for both the integration and system test phases have passed.
• The build successfully passes the final regression testing.
• There are no discrepancies between the master setup and the version used during the final regression testing.
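Taken together, the criteria above amount to a simple gate that every build must clear. A sketch, assuming hypothetical status flags that would in practice be derived from Defect Tracker queries and the final regression run:

```python
# Hypothetical build status -- in practice these flags would be derived
# from Defect Tracker queries and the final regression results.
build_status = {
    "open_sev1_or_sev2_bugs": 0,
    "all_scheduled_test_cases_passed": True,
    "final_regression_passed": True,
    "setup_matches_master": True,
}

def ready_for_uat(status):
    """The build migrates to UAT only if every criterion is met."""
    return (status["open_sev1_or_sev2_bugs"] == 0
            and status["all_scheduled_test_cases_passed"]
            and status["final_regression_passed"]
            and status["setup_matches_master"])

print("release to UAT:", ready_for_uat(build_status))
# -> release to UAT: True
```

The production gate in section 4.5 is the same check plus the two UAT criteria, so in practice it can reuse this function with extra flags.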

4.5 Release to Production Criteria

The release criteria necessary to allow the code to migrate to production are as follows:

• There are no open bugs with a severity of 1 or 2.
• Test cases scheduled for both the integration and system test phases have passed.
• The build successfully passes the final regression testing.
• There are no discrepancies between the master setup and the version used during the final regression testing.
• The user acceptance test was successfully completed.
• The user acceptance criteria were met.


5. Hardware

5.1 Servers

Machine Name | Purpose                      | Owner     | Hardware
SYSTEST      | System test machine          | John Doe  | 2 x P133, 256 MB RAM
UAT          | User acceptance test machine | Jane Doe  | 2 x P200, 256 MB RAM
PROD         | Production machine           | Kevin Doe | 4 x P600, 1 GB RAM

5.2 Server Configuration

SYSTEST:
  SQL Server Service Pack 2
  NT Service Pack 5
UAT:
  SQL Server Service Pack 2
  NT Service Pack 5
PROD:
  SQL Server Service Pack 2
  NT Service Pack 5

5.3 Other Servers

None

5.4 Clients

Machine Name                | Location       | Operating System | Hardware
CLIENTTESTER1 (file server) | Server room 14 | NT 3.51 Server   | PIII-400, 65 MB RAM
CLIENTTESTER2 (client)      | Server room 12 | 95 US English    | PII-500, 16 MB RAM
CLIENTTESTER3 (client)      | Server room 10 | 95 UK English    | P-300, 40 MB RAM
CLIENTTESTER4 (client)      | Server room 14 | 95 German        | P-100, 32 MB RAM
CLIENTTESTER5 (client)      | Server room 14 | NT 3.51 Wkst     | P-200, 32 MB RAM
CLIENTTESTER6 (client)      | Test lab       | NT 4.0 Wkst      | P-300, 48 MB RAM

6. Project Plan

The project plan is created using Project 98 and is linked into the project manager's project plan, eliminating the need to keep a separate copy updated for the project manager.

Note: make sure that the project plan appears on your web site and that it contains cost information as well as dates. Include a summarized (collapsed) view of the project below, followed by a view of the cost information. Below is an example. You can copy/paste this information from Project 98 by placing these fields next to each other in the Gantt view, selecting the rows, and using the copy/paste functionality.


6.1 Critical Dates

Task                                   | Work   | Begin        | End
[product name] project timeline        | 2,980h | Mon 11/24/97 | Fri 5/8/98
Test planning                          | 1,254h | Mon 11/24/97 | Fri 5/8/98
Resource availability                  | 40h    | Mon 12/29/97 | Wed 1/14/98
Test lead activities                   | 452h   | Mon 11/24/97 | Fri 5/8/98
Box 1 - detailed test plans/test cases | 762h   | Mon 11/24/97 | Fri 2/6/98
**ProdSupp** PO copy                   | 48h    | Mon 1/5/98   | Mon 1/12/98
Regression testing I                   | 232h   | Mon 11/24/97 | Mon 1/12/98
Functionality 1                        | 58h    | Tue 1/13/98  | Thu 1/22/98
Functionality 2                        | 224h   | Wed 12/31/97 | Fri 2/6/98
Interfaces / end-to-end testing        | 200h   | Thu 1/15/98  | Thu 2/5/98
System testing                         | 1,102h | Mon 2/9/98   | Thu 4/30/98
Original test items                    | 344h   | Mon 2/9/98   | Wed 3/4/98
Receive drop from development          | 0h     | Mon 2/9/98   | Mon 2/9/98
Install build                          | 16h    | Mon 2/9/98   | Tue 2/10/98
Build verification test                | 32h    | Wed 2/11/98  | Thu 2/12/98
Regression testing                     | 48h    | Fri 2/13/98  | Fri 2/20/98
Functionality 1                        | 24h    | Mon 3/2/98   | Wed 3/4/98
Functionality 2                        | 136h   | Wed 2/11/98  | Fri 2/27/98
Interface testing                      | 88h    | Wed 2/11/98  | Wed 2/25/98
Extended test cases                    | 758h   | Mon 2/23/98  | Thu 4/30/98
End to end testing                     | 472h   | Thu 3/19/98  | Fri 4/17/98
Reiteration of bug fixes               | 152h   | Fri 4/10/98  | Thu 4/30/98

6.2 Budget Information

Note: include a 20% contingency in your plan, either by listing specific risks that account for that 20% or by increasing your hours on each task to absorb the 20% contingency.
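The 20% contingency works out as a simple multiplier on either hours or cost. As an illustration with hypothetical per-task hours (not the figures from section 6.1):

```python
CONTINGENCY = 0.20  # the 20% contingency from the note above

def with_contingency(hours):
    """Inflate an estimate to absorb the contingency."""
    return hours * (1 + CONTINGENCY)

# Illustrative estimates in hours; the real values come from the plan.
tasks = {"test planning": 100, "system testing": 250}
padded = {name: with_contingency(h) for name, h in tasks.items()}
print(padded)  # -> {'test planning': 120.0, 'system testing': 300.0}
```

The same multiplier applies if you instead hold the hours fixed and list specific risks whose combined exposure equals 20% of the total.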

Task                                   | Total Cost
[productname] project timeline         | $164,820.00
Test planning                          | $78,520.00
Resource availability                  | $2,000.00
Test lead activities                   | $38,420.00
Box 1 - detailed test plans/test cases | $38,100.00
**ProdSupp** PO copy                   | $2,400.00
Regression testing I                   | $11,600.00
Functionality 1                        | $2,900.00
Functionality 2                        | $11,200.00
Interfaces / end-to-end testing        | $10,000.00
System testing                         | $55,100.00
Original test items                    | $17,200.00
Receive drop from development          | $0.00
Install build                          | $800.00
Build verification test                | $1,600.00
Regression testing                     | $2,400.00
Functionality 1                        | $1,200.00
Functionality 2                        | $6,800.00
Interface testing (schema changes)     | $4,400.00
Extended test cases                    | $37,900.00
End to end testing                     | $23,600.00
Reiteration of bug fixes               | $7,600.00
