Manual Testing
Software Quality:
A software product has quality only when it meets customer requirements, customer satisfaction & customer expectations. Meeting customer requirements refers to producing the proper output; meeting customer expectations refers to extra characteristics such as a good interface, speed, privacy, security, ease of operation & good functionality.
Non-technical factors of quality: cost of the product & time to market.
Software Quality Assurance:
SQA is the set of concepts to be followed by a company while developing software. An SQA team is responsible for monitoring & measuring the strength of the development processes.
Software Project:
A set of problems assigned by a client, to be solved by software people through the software engineering process, is called a software project. In short: the problem, the people & the process make up the project.
Software Development Life Cycle / Life Cycle Development:
Stages involved in software project development:
1) Information Gathering
2) Analysis
3) Design
4) Coding
5) Testing
6) Maintenance
gcrindia@gmail.com
V MODEL TESTING

In the V model, development stages on one side are paired with testing activities on the other:
Information gathering -> assess the development plan, prepare the test plan, requirement phase testing
S/w RS -> System Testing
HLDs -> Integration Testing
LLDs -> Unit Testing
Coding forms the base of the V; Install Build & Maintenance follow on the testing side.
Review factors:
1) Are they complete?
2) Are they met with the right requirements of the client / customer?
3) Are they achievable w.r.t technology?
4) Are they reasonable w.r.t time & cost?
5) Are they testable?
After completion of the Analysis phase & its reviews, the project-level designers start the logical design of the application in terms of external & internal design (HLDs & LLDs). In this stage, they conduct reviews for the completeness & correctness of the design documents.
3) Mutation Testing: the developers make small, deliberate changes to an already-tested program to check whether the existing tests can detect them.
a) Top-Down Approach

        Main
       /    \
    Sub 1    Stub (for Sub 2)

In this approach, testing is conducted on the main module without first testing some of its sub-modules. In the diagram above, a Stub is a temporary program used in place of an under-construction sub-module; it is known as the called program.
b) Bottom-Up Approach

    Driver (for Main)
       /    \
    Sub 1    Sub 2

In this approach, testing is conducted on the sub-modules without first testing the main module. In the diagram above, a Driver is a temporary program used in place of the main module; it is known as the calling program.
c) Sandwich / Hybrid Approach

    Driver (for Main)
       /    \
    Sub 1    Sub 2
                \
                 Stub (for Sub 3)

This approach combines the top-down & bottom-up approaches: a Driver & a Stub are used together when both the main module & some sub-modules are under construction.
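The stub & driver ideas above can be sketched in code. This is a minimal illustration, not from the original notes; all module names (main_module, sub_module_1, sub_module_2_stub) are hypothetical:

```python
def sub_module_2_stub(data):
    """Stub: a temporary 'called program' standing in for an
    under-construction sub-module (top-down approach)."""
    return "stub-response"

def main_module(data):
    # The main module is ready; it calls the stub where Sub 2 will go.
    return f"processed:{sub_module_2_stub(data)}"

def sub_module_1(data):
    # A finished sub-module under test.
    return data.upper()

def driver(data):
    """Driver: a temporary 'calling program' standing in for the main
    module (bottom-up approach); it invokes the finished sub-module."""
    return sub_module_1(data)

print(main_module("x"))   # top-down: main tested, Sub 2 stubbed
print(driver("order"))    # bottom-up: Sub 1 tested via the driver
```

Once the real Sub 2 (or the real main module) is finished, the stub (or driver) is simply replaced by it.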
System testing is classified into four parts:
1) Usability Testing
2) Functional Testing
3) Performance Testing
4) Security Testing
Of the above, 1 & 2 are core level and 3 & 4 are advanced level.
During Usability testing, the testing team validates the user-friendliness of screens.
During Functional testing, the testing team validates the correctness of customer requirements.
During Performance testing, the testing team estimates the speed of processing.
During Security testing, the testing team validates the privacy of user operations.
1) Usability Testing
2) Functional Testing
g) Installation Testing
During this test, the testing team validates whether the application build, along with supporting software, installs on customer-like configured systems. During this test, the testing team observes the below factors:
Setup program execution to start installation
Easy interface during installation
Amount of disk space occupied after installation
h) Parallel / Comparative Testing
During this test, Testing team compares application build with
competitive products in market.
i) Sanitation / Garbage Testing
During this test, Testing team tries to find extra features in
application build w.r.t customer requirements.
* Defects
During testing, the testing team reports defects to the developers in terms of the below categories:
1. Mismatch between expected & actual results
2. Missing functionality
3. Extra functionality w.r.t customer requirements
* Manual v/s Automation
When a tester conducts a test on the application build without using any testing tool, it is called manual testing; if a testing tool is used, it is called automation testing.
In the common testing process, Test Engineers apply test automation w.r.t test impact & criticality. Impact -> test repetition; Criticality -> complexity of applying the test manually. Due to these two reasons, testing people use test automation.
j) Re-testing
Re-testing is the re-execution of a test with multiple test data to validate a function. E.g., to validate multiplication, Test Engineers use different combinations of input in terms of min, max, -ve, +ve, zero, int, float, etc.
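The multiplication example can be sketched as a small re-testing loop. The data set below simply illustrates the zero / -ve / +ve / int / float / large-value combinations mentioned above; the multiply function is a stand-in for the feature under test:

```python
def multiply(a, b):
    # Stand-in for the function being validated.
    return a * b

# One test condition, re-executed with multiple test data.
test_data = [
    (0, 5, 0),               # zero
    (-3, 4, -12),            # negative
    (2, 3, 6),               # positive int
    (1.5, 2, 3.0),           # float
    (10**9, 2, 2 * 10**9),   # large value
]

for a, b, expected in test_data:
    actual = multiply(a, b)
    status = "pass" if actual == expected else "fail"
    print(f"multiply({a}, {b}) -> {actual} [{status}]")
```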
k) Regression Testing
Regression testing is the re-execution of tests on a modified build to ensure that the bug fixes work & that no side effects have occurred. Test Engineers conduct this test using automation.
l) Error, Defect & Bug
A mistake in the code is an error. Due to errors in coding, Test Engineers get mismatches in the application build, called defects. If a defect is accepted by the developers to be solved, it is called a bug.
Testing Documents
Test Policy
Test Strategy
Test Methodology
Test Plan
Test Cases
Test Procedure
Test Script
Defect Report
Final Test Summary Report
The figure above shows the various levels of documents prepared during project testing. The Test Policy is documented by Quality Control. The Test Strategy & Test Methodology are documented by the Quality Analyst or Project Manager. The Test Plan, Test Cases, Test Procedure, Test Script & Defect Report are documented by Quality Assurance Engineers or Test Engineers.
Test Policy & Test Strategy are Company Level Documents. Test
Methodology, Test Plan, Test Cases, Test Procedure, Test Script,
Defect Report & Final Test Summary Report are Project Level
Documents.
1) TEST POLICY:
Testing Process
Testing Standards (functional points)
Testing Measurements

2) TEST STRATEGY:
15. Methodology: whether our testers are following the standards or not during testing
3) TEST METHODOLOGY:
It is a project-level document. The methodology provides the required testing approach to be followed for the current project. At this level, the Quality Analyst selects the possible approaches for the corresponding project testing.
Test Initiation -> Test Planning -> Test Designing -> Test Execution -> Test Reporting -> Test Closure
Pet Process:
A process involves experts, tools & techniques. The PET process is a refined form of the V-model. It defines the mapping between development & testing stages. Following this model, organizations maintain a separate team for Functional & System testing, & the remaining stages of testing are done by the development people. This model was developed at HCL & is recognized by the QA Forum of INDIA.
TESTING PROCESS
Information Gathering (Business Requirement Specifications)
Analysis (Software Requirement Specifications)
Design (Test Initiation begins in parallel)
Coding (Unit & Integration testing)
Level 0: Sanity / Smoke / Tester Acceptance Test / Build Verification Test
Test Automation
Create Test Suites / Test Batches / Test Sets
Level 1: select a batch & start execution
Defect fixing & resolving (by developers)
Level 2: Regression on the modified build
Level 3: Final Regression / Release Testing / Pre-Acceptance / Post-Mortem testing
User Acceptance Testing
Sign Off
Test Closure
4) TEST PLANNING:

Inputs: development documents & the Test Responsible Matrix (TRM)
1] Team Formation
2] Identify Tactical Risks
3] Prepare Test Plan
4] Review Test Plan
Output: System Test Plan

Team Formation:
Above (3), (4) & (5) decide which modules are to be tested -> What to test?
06) Approach: list of the selected testing techniques to be applied on the above specified modules, in reference to the TRM (Test Responsible Matrix).
07) Feature pass or fail criteria: when a feature passes or fails (the environment is good; an after-testing conclusion).
08) Suspension criteria: possible abnormal situations that arise during the testing of the above features (the environment is not good; a during-testing conclusion).
09) Test Environment: required software & hardware for testing the above features.
10) Test Deliverables: required testing documents to be prepared by the testers during testing.
11) Testing Tasks: necessary tasks to do before starting every feature's testing.
Above (6) to (11) specify -> How to test?
12) Staff & Training: names of the selected Test Engineers & their training requirements.
13) Responsibilities: work allocation to every member of the team (dependable modules are given to a single Test Engineer).
14) Schedule: dates & times of testing the modules.
Above (14) specifies -> When to test?
15) Risks & Mitigation: possible testing-level risks & the solutions to overcome them.
16) Approvals: signatures of the test plan authors & the Project Manager / Quality Analyst.
5) TEST DESIGNING:
BRS -> HLDs -> LLDs -> Coding -> *.exe; Test Cases are prepared from these documents.
From the above model, Test Engineers prepare Test Cases based on the corresponding Use Cases, & every Test Case defines a test condition to be applied.
To prepare Test Cases, Test Engineers study the Use Cases using the below approach:
Steps:
1) Collect the Use Cases of the responsible module
2) Select a Use Case & its dependencies from the list
2.1) Identify the entry condition (base state)
2.2) Identify the input required (test data)
2.3) Identify the exit condition (end state)
2.4) Identify the output & outcome (expected)
2.5) Identify the normal flow (navigation)
2.6) Identify the alternative flows & exceptions
3) Write Test Cases based on the above information
4) Review the Test Cases for completeness & correctness
5) Go to step (2) until completion of all Use Cases
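The information gathered in steps 2.1 to 2.6 can be captured in a simple record. A sketch follows; the TestCase class & the sample values are illustrative, not part of the notes:

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    # Fields mirror steps 2.1-2.6 of the use-case study approach.
    name: str
    entry_condition: str                 # 2.1 base state
    test_data: list                      # 2.2 input required
    exit_condition: str                  # 2.3 end state
    expected: str                        # 2.4 output & outcome
    normal_flow: list = field(default_factory=list)        # 2.5 navigation
    alternative_flows: list = field(default_factory=list)  # 2.6 exceptions

# Hypothetical test case derived from a login use case.
tc = TestCase(
    name="TC_login_valid",
    entry_condition="Login screen displayed",
    test_data=["user1", "pass1"],
    exit_condition="Home screen displayed",
    expected="User logged in",
    normal_flow=["enter id", "enter password", "click OK"],
)
print(tc.name, "->", tc.exit_condition)
```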
Use Case I:
BVA (size):
3 chars  => fail
4 chars  => pass
5 chars  => pass
15 chars => pass
16 chars => pass
17 chars => fail

ECP:
Valid: a-z, A-Z, 0-9
Invalid: special chars, blank
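The BVA & ECP tables above can be checked mechanically. A sketch assuming the field accepts 4 to 16 alphanumeric characters, per the example; is_valid is a hypothetical implementation of that rule:

```python
import re

MIN_LEN, MAX_LEN = 4, 16  # boundaries from the BVA table above

def is_valid(value):
    # Accepts only a-z, A-Z, 0-9, with length between the boundaries.
    return bool(re.fullmatch(r"[A-Za-z0-9]{%d,%d}" % (MIN_LEN, MAX_LEN), value))

# Boundary Value Analysis: values at & around the size limits.
bva = {"abc": False, "abcd": True, "abcde": True,
       "a" * 15: True, "a" * 16: True, "a" * 17: False}

# Equivalence Class Partitioning: one representative per class.
ecp = {"user": True, "USER": True, "1234": True,
       "us@r": False, "": False}

for value, expected in {**bva, **ecp}.items():
    assert is_valid(value) == expected, value
print("all BVA/ECP checks pass")
```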
07) Test effort (person/hr): time to execute this Test Case, e.g. 20 minutes
08) Test duration: date & time
09) Test Setup: required testing tasks to do before starting the case execution (pre-requisites)
10) Test Procedure: step-by-step procedure to execute the Test Case
    1) Step No
    2) Action
    3) Input required
    4) Expected
    5) Actual
    6) Result
    7) Comments
11) Test Case pass or fail criteria: when this case passes or fails
Note: Test Engineers follow the list of Test Cases along with the step-by-step procedures only.
Example 1:
Prepare the Test Procedure for the below test case: successful file save operation in Notepad.

Step No   Action                                      Input Required     Expected
1         Open Notepad                                -                  Empty editor
2         Fill with text                              -                  Save icon enabled
3         Click the Save icon, or click the           -                  Save dialog box appears with
          File menu & select the Save option                             default file name
4         Enter file name & click Save                Unique file name   Focus returns to Notepad & file
                                                                         name appears in title bar of Notepad
DATA MATRIX

      ECP                 BVA (Size / Range)
Valid     Invalid       Minimum       Maximum
xxxx      xxxx          xxxx          xxxx
Note: In general, Test Engineers prepare step-by-step-procedure-based Test Cases for functionality testing. Test Engineers prepare valid/invalid-table-based Test Cases for input-domain-of-object testing (the Data Matrix).
Note: For examples, refer to the notes.
c) User Interface based test case design (MS-Windows rules)
To conduct Usability Testing, Test Engineers prepare a list of Test Cases based on the organization's User Interface standards or conventions, global User Interface rules & the interests of the customer-site people.
Example Test Cases:
1) Spelling check
2) Graphics check (screen-level alignment, font style, color, size & the Microsoft six rules)
3) Meaning of error messages
4) Accuracy of data displayed
5) Accuracy of data in the database as the result of user inputs; if the developer restricts the data at the database level by rounding / truncating, then the developer must also restrict the data in the front-end as well
6) Accuracy of data in the database as the result of external factors, e.g. file attachments
7) Meaningful help messages (manual support testing)
BR based coverage
Use Cases based coverage
Data model based coverage
User Interface based coverage
TRM based coverage
Source (Use Cases, Data model)    Test Cases
xxxx                              xxxx, xxxx, xxxx
xxxx                              xxxx, xxxx, xxxx
xxxx                              xxxx, xxxx, xxxx
6) TEST EXECUTION:
Defect Report -> Defect Fixing -> Defect Resolving -> Modified Build

Level 0: Sanity / Smoke
Level 1: Comprehensive
Level 2: Regression
Level 3: Final Regression
The developers place the build on a server; the testers download it through FTP (File Transfer Protocol) into the testing environment.
Understandable
Operatable
Observable
Consistent
Controllable
Simple
Maintainable
Automatable
Test Automation:
Test execution results: Pass, Fail, Partial Pass / Fail, Closed, Skip, In Queue, In Progress, Blocked

Defect severity levels: Low, Medium, High
Case I:
If the development team resolves bugs whose severity is high, Test Engineers re-execute all P0, all P1 & carefully selected P2 Test Cases on the modified build.
Case II:
If the bug severity is medium, then all P0, carefully selected P1 & some P2 Test Cases are re-executed.
Case III:
If the bug severity is low, then some P0, P1 & P2 Test Cases are re-executed.
Case IV:
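The severity-based selection in Cases I-III above can be sketched as a lookup; the wording of each entry follows the cases, & the function name is illustrative:

```python
def select_regression(severity):
    """Map the severity of a resolved bug to the regression test
    selection described in Cases I-III (P0/P1/P2 are priorities)."""
    selection = {
        "high":   {"P0": "all",  "P1": "all",                "P2": "carefully selected"},
        "medium": {"P0": "all",  "P1": "carefully selected", "P2": "some"},
        "low":    {"P0": "some", "P1": "some",               "P2": "some"},
    }
    return selection[severity]

print(select_regression("high"))
print(select_regression("low"))
```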
Regression flow: Gather Regression Requirements -> Effort Estimation -> Plan Regression -> Final Regression -> Test Reporting

7) TEST REPORTING:
Test reporting takes place across the execution levels (Level 0 to Level 3).
During comprehensive testing, Test Engineers report mismatches as defects to the developers through the IEEE format:
1) Defect ID: unique number or name
2) Description: summary of the defect
3) Feature: the module / function / service in which the Test Engineer found the defect
4) Test Case Name: the corresponding failing test condition
Defect reporting flow: the Test Engineer raises the defect to the Test Lead & Test Manager; on the development side it goes to the Project Manager, Team Lead & Developer. If a high-severity defect is rejected, the issue is escalated to the Quality Analyst.
Defect statuses: New, Closed, Reopened.
8) TEST CLOSURE:
1) Coverage analysis:
BR based coverage
Use Cases based coverage
Data model based coverage
User Interface based coverage
TRM based coverage
2) Bug density: e.g., Module A accounts for 20% of the total bugs found
Bug Description
Feature
Found By
Status (closed / deferred)
Comments
CASE STUDY ON A PROJECT TESTING PROCESS
Environment: the client runs on the local host & connects through a DSN to the DB, on Windows 2000.
This product maintains a default administrator to create new users, & every valid user can search data in the database.
Activity flow diagram:

Login
 -> Admin: new user creation
 -> Valid user: search records (search keys against the existing DB)
 -> Invalid user: re-login
FUNCTIONAL POINTS:
Login takes a user id & password.
The user id & password allow alphabets in lower case, 4 to 8 characters long.
New user ids are created by administrators only.
The new user creation window allows a unique user id & password, with Create & Cancel buttons.
The search records window is open to valid users only.
The search records window maintains the below search keys:
Customer id: 6-digit number (collected from the design document)
First name: one character, in upper or lower case
Last name: one to eight characters, in lower case
Date of birth: dd\mm\yy, in numeric
Age: from & to, in numeric
The search records window allows the below combinations of fields to search records:
Last name
Last name with first name
Last name with first name & d-o-b
Last name with first name & age
Customer id only
The search records window consists of Start Search & Stop Search buttons.
Allows last name as full or partial.
Allows customer id as full or partial with * as a wild card, e.g. 2286*.
Refresh the search records window using Alt+Ctrl.
Displays matched records in a pop-up window with an OK button after search completion.
Returns a message like "too many matches to display" when matched records > 1000.
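The customer-id search-key rule (a 6-digit number, full or partial with * as a wild card) can be sketched as below; the exact partial-id grammar is an assumption, since the functional points only give the example 2286*:

```python
import re

def valid_customer_id_key(key):
    """Accept a full 6-digit customer id, or a partial id of 1-5
    digits ending in the '*' wild card (assumed grammar)."""
    if re.fullmatch(r"\d{6}", key):       # full id: exactly 6 digits
        return True
    if re.fullmatch(r"\d{1,5}\*", key):   # partial id ending in '*'
        return True
    return False

for key in ["228601", "2286*", "22860*", "12", "*23", "1*23X"]:
    print(key, "->", "accept" if valid_customer_id_key(key) else "reject")
```

Keys that start with *, contain letters, or have the wrong length are rejected, matching the invalid examples in the test cases below.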
Test methodology (by PM): the below testing factors are mapped to the System testing stage:
Authorization, Access Control, Audit trail, Continuity of processing, Correctness, Coupling, Data integrity, Ease of use, Ease of operation, Reliability, Portability, Performance, Service levels, Maintainability
Responsibility           Document / Report   Effort
K. Srinaiah — Testing    Defect Report       12 hours
15. Schedule:

Task                      Effort     Start date    End date
Test design               12 hours   28-12-2005    29-12-2005
Implement test & review   4 hours    29-12-2005    29-12-2005
Execute test              24 hours   30-12-2005    01-01-2006
Evaluate test             8 hours    02-01-2006    02-01-2006
Password    Criteria
valid       pass
invalid     fail
valid       pass
invalid     fail
invalid     fail
Blank values:
valid value + blank -> fail
blank + valid value -> fail
Test case 5:
Test case 6:
Test case 7: successful entry of customer id
BVA (size): min = max = 6
ECP (type):
valid: 0-9
invalid: a-z, A-Z, special characters except *; blank space; ids starting with *
Examples: 123, *23, 1*23X
Test case 14: successful entry of first name
BVA (size): min = max = 1 -> pass; 0 -> fail
ECP (type):
valid: a-z, A-Z
invalid: 0-9, special characters

Test case 15: successful entry of last name
ECP (type):
valid: a-z
invalid: A-Z, 0-9, special characters

Test case 16: successful entry of date of birth
ECP (type):
valid: 0-9
invalid: a-z, A-Z, special characters
Year: min 00, max 99

Test case 17: successful entry of age (from)
BVA: min 01, max 22
ECP (type):
valid: 0-9
invalid: a-z, A-Z, special characters
Test case 20: successful display of matched records with full last name & first name
Test case 21: successful display of matched records with full last name, first name & dob
Test case 22: successful display of matched records with full last name, first name & age
Test case 23: unsuccessful search operation due to an invalid combination of filled fields
Test case 24: unsuccessful search operation due to invalid dob
Test case 25: unsuccessful search operation due to age From greater than age To
Test case 26: successful display of records with the search key as partial last name & the other fields blank
Test case 27: successful display of records with the search key as partial last name & first name
Test case 28: successful display of records with the search key as partial last name, first name & dob
Test case 29: successful display of records with the search key as partial last name, first name & age
Test case 30: successful display of records with the search key as customer id
Test case 31: successful display of records with the search key as partial customer id with * as wild card
Test case 32: unsuccessful display of records due to no matching records in the database w.r.t the given search keys
Test case 33: unsuccessful display of records due to too many records to display, when the number of matched records is greater than 1000
Test case 34: successful refresh of the search window with Alt+Ctrl
Test case 35: successful termination of the search operation when the Stop Search button is clicked
Test case 36: successful closing of the records window by clicking OK after searching
Test case 37: spelling check on every screen
Test case 38: graphics check on every screen, e.g. alignment, font, size, labels, graphics, colour, etc.
Test case 39: accuracy of data displayed, e.g. dob as dd\mm\yy
(Test cases 37-39 fall under Usability Testing.)