Pergamon
0031-3203(95)00092-5
A NEURAL NETWORK APPROACH TO OFF-LINE
SIGNATURE VERIFICATION USING DIRECTIONAL PDF
(Received 24 February 1994; revised 11 April 1995; received for publication 3 July 1995)
Abstract--A neural network approach is proposed to build the first stage of an Automatic Handwritten Signature Verification System. The directional Probability Density Function was used as a global shape factor and its discriminating power was enhanced by reducing its cardinality via filtering. Various experimental protocols were used to implement the backpropagation network (BPN) classifier. A comparison, on the same database and with the same decision rule, shows that the BPN classifier is clearly better than the threshold classifier and compares favourably with the k-Nearest-Neighbour classifier.
Pattern recognition
Classifiers
Neural networks
Backpropagation
Automatic signature verification
Directional probability density function
1. INTRODUCTION
The design of a complete Automatic Handwritten Signature Verification System (AHSVS) that is able to take into account all classes of forgeries is a very difficult task.(1) Indeed, a complete AHSVS should be able to discriminate between genuine signatures and the following forgeries: random forgeries, characterized by a different semantic meaning and consequently by a different overall shape when compared to genuine signatures; simple forgeries, with the same semantic meaning as genuine signatures but an overall shape that differs greatly; freehand and simulated forgeries, produced with a priori knowledge of both the semantic meaning and the graphical model of a target signature, by a skilled or an occasional forger respectively; and finally, tracing forgeries and photocopies, with almost the same graphical aspect as genuine signatures but with different pseudo-dynamic properties, such as dissimilarities in grey-level-related features like texture and contrast.

In such a system, in order to take into account all classes of forgeries, the decision is made only at the end of the verification process. Consequently, this approach is very costly in terms of computational resources and of related algorithmic complexity.(2) Since random and simple forgeries represent almost 95% of the cases generally encountered in practice,(3,4) a better solution might be to subdivide the verification process in such a way as to rapidly eliminate gross forgeries. Thus, a two-stage AHSVS seems to be a more practical solution, where the first stage would be responsible for this rapid elimination and the second stage would be used only in complicated cases. The
A standard signature database of 40 signatures from each of 20 writers (800 images) is used in this study. For the choice of pre-treatment (Section 3), the first 20 and the last 20 signatures of each writer were used to build the reference set and the test set, respectively. For the design of the BPN (Section 4), the first 20 signatures were used to build the training set, while the last 20 signatures were divided into two groups to build the test set (first 10) and the validation set (last 10).
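The per-writer split described above can be sketched as follows; the ordering of `signatures` matching the database ordering is an assumption of this sketch:

```python
def split_writer_signatures(signatures):
    """Split one writer's 40 signatures per the protocol of this study.

    For the BPN experiments: the first 20 signatures form the training
    set, the next 10 the test set, and the last 10 the validation set.
    """
    assert len(signatures) == 40, "each writer contributes 40 signatures"
    training = signatures[:20]
    test = signatures[20:30]
    validation = signatures[30:]
    return training, test, validation

# Dummy stand-ins for 40 signature feature vectors:
tr, te, va = split_writer_signatures(list(range(40)))
print(len(tr), len(te), len(va))  # 20 10 10
```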
2.2. Reference and training sets

The reference set for the kNN classifier and the training set for the BPN classifier were composed of 280 examples: 140 related to the genuine signatures of an individual (class ω1) and another 140 related to random forgeries, defined as a subset of signatures from all the other individuals (class ω2). For each writer, the 140 examples in class ω1 were obtained by rotating all the full directional PDFs issued from the 20 reference signatures in a circular fashion, from -6 to +6 degrees in 2-degree increments. In the case of class ω2, the 140 examples were obtained by choosing seven or eight reference signatures at random from each of the 19 other individuals. The cardinalities of classes ω1 and ω2 are equal in order not to favour one class over the other.
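The rotation-based augmentation can be sketched as a circular shift of the 180-bin directional PDF; assuming one bin per degree (so a shift of one bin approximates a one-degree rotation), the seven shifts from -6 to +6 in steps of 2 indeed yield 20 x 7 = 140 examples:

```python
import numpy as np

def augment_by_rotation(pdf, shifts=range(-6, 7, 2)):
    """Return circularly shifted copies of a 1-D directional PDF.

    Shifting the histogram bins approximates rotating the signature;
    the default shifts give 7 copies per reference PDF.
    """
    return [np.roll(pdf, s) for s in shifts]

# 20 dummy reference PDFs with 180 bins each (uniform, for illustration):
refs = [np.full(180, 1.0 / 180) for _ in range(20)]
class_w1 = [rot for ref in refs for rot in augment_by_rotation(ref)]
print(len(class_w1))  # 140
```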
3. CHOICE OF PRE-TREATMENT
Fig. 1. Full and pretreated (with a 6B7 filter) directional PDF of the handwritten signature shown in the upper part of this figure. Both curves are plotted against direction in degrees (0-180).
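As a rough sketch of the pipeline behind Fig. 1: the paper's exact filter is not reproduced here; I interpret the integration filter with a step of ten (Section 3) as a sum over non-overlapping windows, and `directional_pdf` assumes stroke directions have already been extracted from the signature image:

```python
import numpy as np

def directional_pdf(angles_deg, n_bins=180):
    """Normalized histogram of stroke directions, taken modulo 180."""
    hist, _ = np.histogram(np.asarray(angles_deg) % 180,
                           bins=n_bins, range=(0, 180))
    return hist / hist.sum()

def integration_filter(pdf, step=10):
    """Reduce the PDF cardinality by summing non-overlapping windows."""
    return pdf.reshape(-1, step).sum(axis=1)

pdf = directional_pdf([10, 10, 45, 90, 90, 90, 170])
reduced = integration_filter(pdf, step=10)
print(reduced.shape)  # (18,) -- 180 bins reduced to 18
```

With a step of ten the 180-dimensional PDF becomes an 18-dimensional vector, which is the kind of cardinality reduction the text attributes to the pre-treatment.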
J.-P. Drouhard et al.
Fig. 2. Mean global error in memorization (Etm) and generalization (Etg) as a function of the number of presentations (NBP). The bold line shows the mean value and the fine lines on both sides show the standard deviation (SD).
Definitions of the rejection criteria (o1 and o2 denote the two output-neuron values):

RC1: MAX{o1, o2} < TM
RC2: ABS(o1 - o2) <= TA
RC3:
RC4:
RC5: MAX{o1, o2} < TM  and  (MAX{o1, o2} - MIN{o1, o2}) / MAX{o1, o2} < TR1
RC6:
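The recoverable criteria can be sketched as decision functions over the two output-neuron values o1 and o2; note that pairing RC5 with the combined condition follows the apparent table layout and is my reading of the garbled source, not a certainty:

```python
def reject_rc1(o1, o2, tm):
    """RC1: reject when the winning output is not confident enough."""
    return max(o1, o2) < tm

def reject_rc2(o1, o2, ta):
    """RC2: reject when the two outputs are too close in absolute value."""
    return abs(o1 - o2) <= ta

def reject_rc5(o1, o2, tm, tr1):
    """RC5 (as read from the table): RC1 plus a relative-difference test."""
    hi, lo = max(o1, o2), min(o1, o2)
    return hi < tm and (hi - lo) / hi < tr1

print(reject_rc1(0.60, 0.30, tm=0.94))            # True
print(reject_rc2(0.90, 0.88, ta=0.05))            # True
print(reject_rc5(0.60, 0.55, tm=0.94, tr1=0.10))  # True
```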
Fig. 3. Evaluation of the rejection criteria RC1, RC2, RC3 and RC5 by means of the total error (Et) and rejection (Rt) rates and of the reliability factor (RF) for various values of their corresponding thresholds TM, TA and TR1.
5. DISCUSSION
Fig. 4. Effect of the number of hidden neurons (Nh) on the performance (Et, Rt and RF) of the BPN evaluated without a rejection criterion or with the rejection criterion RC1 with a threshold level TM of 0.94.
6. CONCLUSIONS
The main objective of this work was to determine whether or not a BPN classifier could be used in the design of the first stage of a complete Automatic Handwritten Signature Verification System (AHSVS). To do this, we chose the directional Probability Density Function (PDF) as a global shape factor, and the completely connected feed-forward neural network with the classical backpropagation learning algorithm, referred to as the Backpropagation Network (BPN), as the classifier.

However, the dimension (180) of the PDF is too large to be properly manipulated by a BPN classifier. Consequently, in a first attempt and by means of the k-Nearest-Neighbour (kNN) classifier, we determined the pre-treatment that improves the classification while decreasing the dimension of the input vector. The results in Section 3 show that the pre-treatment with an integration filter with a step of ten gave the best result.
In the case of the BPN classifier, the training phase is crucial and difficult to control. The major difficulty is to decide when to stop training. In Section 4.2, we defined a stopping criterion based on a measure of the performance of the BPN classifier both in memorization (Etm) and in generalization (Etg), as well as on a maximum number of presentations (NBP) of the training set. Using an experimental protocol, we decided that the training phase would be stopped when
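A stopping rule of the kind described here might look as follows; all threshold values and the patience mechanism are illustrative placeholders, not the paper's actual criterion:

```python
def training_should_stop(e_mem, e_gen_history, nbp, max_nbp=1000,
                         e_mem_target=0.01, patience=3):
    """Hypothetical stopping rule in the spirit of Section 4.2.

    Stop when the memorization error is low enough, when the
    generalization error has not improved over the last `patience`
    evaluations, or when the number of presentations reaches `max_nbp`.
    """
    if nbp >= max_nbp:
        return True
    if e_mem <= e_mem_target:
        return True
    if len(e_gen_history) > patience:
        best_recent = min(e_gen_history[-patience:])
        best_before = min(e_gen_history[:-patience])
        if best_recent >= best_before:  # generalization no longer improving
            return True
    return False

# Generalization error bottomed out at 0.4 and is now rising -> stop:
print(training_should_stop(0.2, [0.5, 0.4, 0.41, 0.42, 0.43], nbp=500))  # True
```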
25. H. Akaike, A new look at the statistical model identification, IEEE Trans. Automat. Control 19, 716-723 (1974).
26. T. Ash, Dynamic node creation in backpropagation networks, Proc. Int. Joint Conf. Neural Networks 2, 623-628 (1989).
27. M. Hagiwara, Novel backpropagation algorithm for re-
About the Author--JEAN-PIERRE DROUHARD received the M.Sc. degree in electronic, electrical and control engineering from the University of Caen, France, in 1972 and the M.Sc.A. and Ph.D. degrees in biomedical engineering from the École Polytechnique de Montréal in 1975 and 1979, respectively. From 1980 to 1989 he was a research associate at the Biomedical Engineering Institute of the École Polytechnique de Montréal. In 1989, he joined the staff of the École de Technologie Supérieure, Université du Québec, Montréal, P.Q., Canada, where he is currently Professor in the Département de Génie de la Production Automatisée. His current research interests are in the application of artificial intelligence techniques such as expert systems, artificial neural networks and fuzzy systems in the fields of pattern recognition and computer vision.
About the Author--ROBERT SABOURIN received the B.Ing., M.Sc.A. and Ph.D. degrees in electrical engineering from the École Polytechnique de Montréal in 1977, 1980 and 1991, respectively. In 1977 he joined the physics department of the Université de Montréal, where he was responsible for the design and development of scientific instrumentation for the Observatoire du Mont Mégantic. In 1983, he joined the staff of the École de Technologie Supérieure, Université du Québec, Montréal, P.Q., Canada, where he is currently Professor in the Département de Génie de la Production Automatisée. His research interests are in the areas of computer vision, scene understanding, segmentation, structural pattern recognition, neural networks and fuzzy systems, character recognition and signature verification.
About the Author--MARIO GODBOUT received the B.Ing. degree in Production Automatisée from the École de Technologie Supérieure de Montréal in 1992. He is currently pursuing a master's degree at the same school. His research interests are in the field of artificial intelligence.