Abstract

Biometric identification is an emerging technology for solving security problems in our networked society. A new branch of biometric technology, palmprint recognition, in which lines and points are extracted from the palm for personal identification, was proposed several years ago [1-6]. In this paper, we apply the feature extraction technique used in iris recognition [7] to low-resolution palmprint images. A 2-D Gabor filter is used to obtain the texture information, and two palmprint images are compared in terms of their Hamming distance. The experimental results show that our method is effective.

1. Introduction

Palmprint identification can be divided into two categories, on-line and off-line. For off-line identification, all the palmprint samples are inked and then transmitted into a computer with a scanner. For on-line identification, the samples are captured with a palmprint scanner that is directly connected to a computer. Off-line palmprint recognition was the main focus of past palmprint research [1-6]. Recently, a CCD camera-based palmprint capture device has been developed [3]. Fig. 1 shows an image captured with this device, which makes real-time applications of on-line palmprint recognition possible. Because of the relatively high resolution of off-line palmprint images (350 dpi), some techniques applied to fingerprint recognition, where lines, datum points and singular points are extracted, can also be used for off-line palmprint recognition [1, 5-6]. In this paper, we use low-resolution on-line images (65 dpi), so we attempt a new approach that extracts texture features from palmprint images.

Figure 1. A palmprint image with line definitions.

An on-line palmprint identification system can be examined in terms of five parts, summarized below:
1) Image acquisition: capture a palmprint with a palmprint scanner [3].
2) Preprocessing: determine a coordinate system and extract the central part of the palmprint image.
3) Feature extraction: extract some stable and unique features.
4) Pattern matching: decide whether two palmprints are from the same person.
5) Data storage: store the features from the registered images for use in later comparisons.

An on-line palmprint system runs in at least two of three modes: enrollment, identification and verification. In the enrollment mode, a new user's palmprint features are extracted and then stored in a database for later comparisons. In the identification mode, an input palmprint must pass through image acquisition, preprocessing and feature extraction; the extracted features are then compared with all records in a given database. The difference between the identification and verification modes is that in the verification mode, each user must have a user ID, and his/her palmprint is compared only with the features belonging to that user ID.
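The five parts and three operating modes above can be sketched in code. Every function body below is an illustrative stand-in, not the paper's method: the real acquisition uses a palmprint scanner, and the real features come from 2-D Gabor filtering. Only the control flow mirrors the text.

```python
# Sketch of the five-part on-line palmprint pipeline; all internals are
# placeholder assumptions, only the enrollment/identification/verification
# control flow follows the paper's description.

def acquire_image(scanner_id: int) -> list:
    """1) Image acquisition: stand-in for a palmprint scanner capture."""
    return [[(i * j + scanner_id) % 256 for j in range(8)] for i in range(8)]

def preprocess(image: list) -> list:
    """2) Preprocessing: stand-in for coordinate alignment and cropping
    the central part of the palmprint."""
    return [row[2:6] for row in image[2:6]]

def extract_features(subimage: list) -> tuple:
    """3) Feature extraction: stand-in for Gabor filtering; here just a
    mean-threshold binarization into a bit vector."""
    mean = sum(sum(row) for row in subimage) / 16.0
    return tuple(1 if v > mean else 0 for row in subimage for v in row)

def match(p: tuple, q: tuple) -> float:
    """4) Pattern matching: normalized Hamming distance between bit vectors."""
    return sum(a != b for a, b in zip(p, q)) / len(p)

database = {}  # 5) Data storage: user ID -> list of enrolled feature vectors

def enroll(user_id: str, scanner_id: int) -> None:
    feats = extract_features(preprocess(acquire_image(scanner_id)))
    database.setdefault(user_id, []).append(feats)

def verify(user_id: str, scanner_id: int, threshold: float = 0.335) -> bool:
    """Verification mode: compare only against the claimed user's templates."""
    probe = extract_features(preprocess(acquire_image(scanner_id)))
    return any(match(probe, t) <= threshold for t in database.get(user_id, []))

def identify(scanner_id: int, threshold: float = 0.335):
    """Identification mode: compare against every record in the database."""
    probe = extract_features(preprocess(acquire_image(scanner_id)))
    best = min(((match(probe, t), uid) for uid, ts in database.items() for t in ts),
               default=(1.0, None))
    return best[1] if best[0] <= threshold else None
```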
This paper is organized as follows: preprocessing is discussed in Section 2; palmprint feature extraction with texture analysis is explained in Section 3; palmprint matching and experimental results are given in Section 4 and Section 5, respectively; finally, some conclusions are provided in Section 6.

Step 5: Extract a sub-image of fixed size on the basis of the coordinate system; the sub-image is located at a certain part of the palmprint and is used for feature extraction (see Fig. 2(f)).
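The texture feature extraction that the abstract describes can be sketched as follows: filter the extracted sub-image with a 2-D Gabor filter and keep only the signs of the real and imaginary responses, giving the two bit matrices that Section 4 compares. The kernel size, sigma, orientation and frequency below are illustrative assumptions, not the tuned values from Section 3.

```python
import numpy as np

def gabor_kernel(size=17, sigma=3.0, theta=np.pi / 4, freq=0.1):
    """2-D Gabor kernel: a complex sinusoid at orientation theta under a
    Gaussian envelope. Parameter values here are assumptions."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    gauss = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return gauss * np.exp(2j * np.pi * freq * xr)

def gabor_features(subimage):
    """Convolve the sub-image with the Gabor kernel and binarize the signs
    of the real/imaginary responses into two N x N bit matrices."""
    k = gabor_kernel()
    half = k.shape[0] // 2
    padded = np.pad(subimage.astype(float), half, mode='edge')
    resp = np.zeros(subimage.shape, dtype=complex)
    for i in range(subimage.shape[0]):
        for j in range(subimage.shape[1]):
            patch = padded[i:i + k.shape[0], j:j + k.shape[1]]
            resp[i, j] = np.sum(patch * k)
    p_real = (resp.real >= 0).astype(np.uint8)
    p_imag = (resp.imag >= 0).astype(np.uint8)
    return p_real, p_imag
```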
4. Palmprint Matching

D_o = \frac{\sum_{i=1}^{N} \sum_{j=1}^{N} \left[ P_R(i,j) \otimes Q_R(i,j) + P_I(i,j) \otimes Q_I(i,j) \right]}{2N^2}, (12)

where P_R (Q_R) and P_I (Q_I) are the real and the imaginary parts of P (Q), respectively; the Boolean operator "\otimes" yields zero if and only if the two bits P_{R(I)}(i,j) and Q_{R(I)}(i,j) are equal; and the size of the feature matrices is N \times N. Note that D_o lies between 0 and 1, and for a perfect match the matching score is zero. In order to further reduce the effect of translation, Eq. 12 can be improved to form Eq. 13:

D_{\min} = \min_{|s|<S,\,|t|<T} \frac{\sum_{i=\max(1,1-s)}^{\min(N,N-s)} \sum_{j=\max(1,1-t)}^{\min(N,N-t)} \left[ P_R(i+s,j+t) \otimes Q_R(i,j) + P_I(i+s,j+t) \otimes Q_I(i,j) \right]}{2H(s)H(t)}, (13)

where H(s) = \min(N,N-s) - \max(1,1-s) + 1 = N - |s| is the number of overlapping rows at vertical shift s (and similarly for H(t)).
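Eqs. 12 and 13 translate directly into code. The sketch below assumes the features are NumPy 0/1 matrices of size N x N, with the Boolean operator implemented as bitwise XOR; S and T default to small assumed translation ranges.

```python
import numpy as np

def hamming_distance(p_real, p_imag, q_real, q_imag):
    """Eq. (12): normalized Hamming distance over the two N x N bit
    matrices. XOR is 0 iff the two bits agree, so 0 is a perfect match."""
    n = p_real.shape[0]
    total = np.sum(p_real ^ q_real) + np.sum(p_imag ^ q_imag)
    return total / (2 * n * n)

def min_hamming_distance(p_real, p_imag, q_real, q_imag, S=3, T=3):
    """Eq. (13): minimum distance over translations |s| < S, |t| < T,
    normalized by the overlap H(s)H(t) = (N - |s|)(N - |t|)."""
    n = p_real.shape[0]
    best = 1.0
    for s in range(-(S - 1), S):
        for t in range(-(T - 1), T):
            # 0-based overlap of Q indices for this (s, t) shift
            i0, i1 = max(0, -s), min(n, n - s)
            j0, j1 = max(0, -t), min(n, n - t)
            pr = p_real[i0 + s:i1 + s, j0 + t:j1 + t]
            pi_ = p_imag[i0 + s:i1 + s, j0 + t:j1 + t]
            qr = q_real[i0:i1, j0:j1]
            qi = q_imag[i0:i1, j0:j1]
            h = (n - abs(s)) * (n - abs(t))
            d = (np.sum(pr ^ qr) + np.sum(pi_ ^ qi)) / (2 * h)
            best = min(best, d)
    return best
```

A feature pair that differs only by a one-pixel translation scores poorly under Eq. 12 but recovers a near-zero score under Eq. 13, which is the point of the translation search.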
Figure 4. Original images and their features: (a) original images with the effective areas specified; (b) features from the real part of the Gabor filter; (c) features from the imaginary part of the Gabor filter.

5. Experimental Results

The performance of the proposed method under different thresholds that control the false accept rate and false reject rate is shown in Table 1. In this experiment, the imposter distribution and the genuine distribution are generated by 1,083 and 769 comparisons, respectively. Fig. 5 shows the two distributions. When the threshold is 0.335, the false reject rate is 0.91% with a 0% false accept rate. Some images are still not recognized by the proposed method because of non-linear distortion. The experimental results are encouraging.

Acknowledgments

The authors would like to thank Wenxin Li and Qingyun Dai at the Biometrics Research Centre for their palmprint image collection. The work is partially supported by a UGC (CRC) fund from the Hong Kong SAR Government and a central fund from The Hong Kong Polytechnic University.
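The threshold trade-off reported in Table 1 can be reproduced in outline from two score samples. The distributions below are synthetic stand-ins for the paper's 769 genuine and 1,083 imposter comparison scores, which are not available here; only the FAR/FRR bookkeeping is faithful.

```python
import numpy as np

# Hypothetical matching-distance samples standing in for the paper's data:
# genuine pairs cluster at low distances, imposter pairs at higher ones.
rng = np.random.default_rng(42)
genuine = rng.normal(0.25, 0.05, 769)
imposter = rng.normal(0.48, 0.04, 1083)

def rates(threshold):
    """A distance below the threshold is accepted, so the false accept rate
    counts imposter scores below it and the false reject rate counts
    genuine scores at or above it (both in percent)."""
    far = np.mean(imposter < threshold) * 100
    frr = np.mean(genuine >= threshold) * 100
    return far, frr

for th in (0.325, 0.335, 0.345, 0.355, 0.365, 0.375):
    far, frr = rates(th)
    print(f"{th:.3f}  FAR {far:5.2f}%  FRR {frr:5.2f}%")
```

Raising the threshold trades false rejects for false accepts, which is exactly the monotone pattern in Table 1.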
Table 1. Experimental results of the selected thresholds.

Threshold   False accept rate (%)   False reject rate (%)
0.325       0.00                    1.56
0.335       0.00                    0.91
0.345       0.37                    0.65
0.355       0.92                    0.65
0.365       2.50                    0.39
0.375       5.36                    0.13

Figure 5. The imposter and genuine distributions (y-axis: percentage (%)).

References:

[1] D. Zhang and W. Shu, "Two novel characteristics in palmprint verification: datum point invariance and line feature matching," Pattern Recognition, vol. 32, no. 4, pp. 691-702, 1999.
[2] N. Duta, A. K. Jain and K. V. Mardia, "Matching of palmprints," to appear in Pattern Recognition Letters, 2001.
[3] D. Zhang, Automated Biometrics – Technologies and Systems, Kluwer Academic Publishers, 2000.
[4] J. You, W. Li and D. Zhang, "Hierarchical palmprint identification via multiple feature extraction," to appear in Pattern Recognition.
[5] W. Shu and D. Zhang, "Automated personal identification by palmprint," Optical Engineering, vol. 37, no. 8, pp. 2359-2362, 1998.
[6] W. Shu and D. Zhang, "Automatic palmprint verification," International Journal of Image and