Figure 1. Fingerprint image classes: (a) Right loop, (b) Whorl, (c) Arch, and (d) Left loop.

2 Algorithm

The algorithm has two stages. In the first stage, the HRC region [1] is extracted; core point detection is eliminated in order to reduce computation. In the second stage, instead of tracing the ridges around the core point, the ridges in the HRC region are traced. Since the core point lies in the HRC region, the classification accuracy is not affected. Once the ridges are traced, the classification is done on the basis of the vectors drawn at the end points and the other parameters described in Subsection 2.2.

2.1 High Ridge Curvature (HRC) Extraction

This stage computes the HRC region as described in [1]. The algorithm used is as follows:

1. A block of size W × W is centered at pixel (i, j) in the normalized fingerprint image.

2. For every pixel in the block, calculate the gradients along the X and Y directions, denoted δx and δy respectively. The horizontal Sobel operator used to compute δx is:

      [ 1  0  −1 ]
      [ 2  0  −2 ]        (1)
      [ 1  0  −1 ]

   The vertical Sobel operator used to compute δy is:

      [  1   2   1 ]
      [  0   0   0 ]        (2)
      [ −1  −2  −1 ]

3. Estimate the local orientation θ(i, j) of each block:

      V_y(i, j) = Σ_{u = i−W/2}^{i+W/2}  Σ_{v = j−W/2}^{j+W/2}  ( δx²(u, v) − δy²(u, v) )        (4)

      θ(i, j) = (1/2) tan⁻¹( V_y(i, j) / V_x(i, j) ) + π/2,        (5)

   where θ is the direction of the block centered at pixel (i, j); it is orthogonal to the local orientation of each block.

4. Locate the blocks with slope values in the range 0 to π/2. Then trace a path downward until a slope outside the range 0 to π/2 is encountered, and mark that block. The block with the highest number of marks yields the HRC region.

2.2 Ridge Tracing

Once the HRC region is extracted, we search for ridges in the downward direction, starting from the top-most center pixel in the HRC region. Tracing is carried out in both directions of the ridge, as shown in Figure 2 (a). The tracing is carried out on the thinned binary image; the extracted binary image is the output of the Gabor filter. The tracing classifies the fingerprint images into different classes.

The tracing is done simply by using the connectivity of the pixels. While tracing, if a bifurcation is encountered at any point, one of the branches is traced till its end; the bifurcation point is then revisited, and the second branch is also traced till its end. Traced pixels are marked so that re-tracing is avoided. While tracing any fingerprint image, we count the number of interceptions made between the ridges and the vertical line formed by joining the top-most and the bottom-most center pixels in the HRC region; the reason for this is given below. When a bifurcation is encountered, the tracing of the second branch does not add to the interception count.

Tracing more than one ridge: this is done in order to avoid errors in classification. For instance, when a ridge ends abruptly and unexpectedly, due to physical damage on the finger or some other reason, more than one ridge is traced in order to retain good classification accuracy. This is a rare case, though: the Gabor filter based extraction is very efficient and hence takes care of most of the discontinuous ridges. In this paper, for every image, we have considered tracing three ridges.
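The gradient and orientation computation of Subsection 2.1 can be sketched in plain Python (the paper's own implementation is in Matlab). Two assumptions are made explicit in the code: V_x is taken to be the companion sum Σ Σ 2 δx(u, v) δy(u, v) as in [8], since its defining equation is not shown here, and atan2 is used in place of tan⁻¹ so that the quadrant is preserved.

```python
import math

# Sobel kernels from equations (1) and (2)
SOBEL_X = [[1, 0, -1], [2, 0, -2], [1, 0, -1]]    # horizontal operator -> delta_x
SOBEL_Y = [[1, 2, 1], [0, 0, 0], [-1, -2, -1]]    # vertical operator   -> delta_y

def gradient(img, kernel, i, j):
    """3x3 correlation of `kernel` with image `img`, centered at pixel (i, j)."""
    return sum(kernel[a + 1][b + 1] * img[i + a][j + b]
               for a in (-1, 0, 1) for b in (-1, 0, 1))

def block_orientation(img, i, j, w):
    """Orientation theta(i, j) of the w x w block centered at (i, j).

    V_y follows equation (4) and theta equation (5); V_x is the assumed
    companion sum of 2*dx*dy (its defining equation is not shown above).
    """
    vx = vy = 0.0
    for u in range(i - w // 2, i + w // 2 + 1):
        for v in range(j - w // 2, j + w // 2 + 1):
            dx = gradient(img, SOBEL_X, u, v)
            dy = gradient(img, SOBEL_Y, u, v)
            vx += 2.0 * dx * dy            # assumed definition of V_x
            vy += dx * dx - dy * dy        # equation (4)
    # atan2 keeps the quadrant; the pi/2 offset makes theta orthogonal
    # to the gradient-based orientation, as in equation (5)
    return 0.5 * math.atan2(vy, vx) + math.pi / 2
```

On a purely horizontal intensity ramp, δy vanishes everywhere, so V_x = 0, V_y > 0, and the sketch returns θ = 3π/4.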
The conditions based on which the images are classified are:

1. Right loop: A ridge starting from the HRC is traced in both directions, and vectors are drawn at the end points, as shown in Figure 2 (b). The angular magnitude between the vectors is very small compared to an arch (where the angular magnitude between the vectors is large). The calculation of the HRC eliminates errors significantly: outside the HRC, or significantly away from the core point, the tracing fails to classify the images. For a fingerprint loop, a ridge outside the HRC may lead to wrong classification, since the vectors at its end points may have a large angular magnitude; this is because the loop exists only close to the core point. This is shown in Figure 2 (c). To distinguish a right loop from a left loop, the sum of the two vectors is taken: the resultant vector points to the right of the starting trace point in the case of a right loop, and to the left of the starting trace point in the case of a left loop.

4. Whorl: In the case of a whorl, the classification is done on the basis of success in any one or all of the following conditions:

   • While tracing, the vertical line (formed by joining the top-most and the bottom-most center pixels in the HRC region) is intercepted many times, which is not the case for a loop or an arch. Typically, the number of such interceptions is one for a loop in almost all cases, and zero for an arch. This is shown in Figure 3.

   • While tracing, if the X and Y coordinates of the traced points are monitored by taking a simple average over the scanned pixels, the centroid (average) lies very close to the core point, which is not the case for a loop or an arch.
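The conditions above can be distilled into a toy decision routine, sketched here in Python. The thresholds are illustrative assumptions only (the interception cut-off echoes the counts reported in Section 3.2), the whorl centroid test is omitted, and the function and argument names are hypothetical.

```python
import math

def classify(interceptions, v1, v2):
    """Toy classifier over one traced ridge.

    interceptions -- crossings of the ridge with the vertical HRC line
    v1, v2        -- (x, y) vectors drawn at the two traced end points
                     (x is assumed to grow to the right)
    Threshold values are illustrative, not taken from the paper.
    """
    if interceptions > 2:                       # many crossings -> whorl
        return "whorl"
    # smallest angle between the two end-point vectors
    angle = abs(math.atan2(v1[1], v1[0]) - math.atan2(v2[1], v2[0]))
    angle = min(angle, 2 * math.pi - angle)
    if interceptions == 0 and angle > math.pi / 2:
        return "arch"                           # large angular magnitude
    # loops: the resultant vector points right or left of the start point
    return "right loop" if v1[0] + v2[0] > 0 else "left loop"
```

For example, two nearly opposite end-point vectors with no interception classify as an arch, while a single interception with a rightward resultant classifies as a right loop.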
Table 1. The classification results obtained.

Class        Images Taken / Correctly Classified
Left Loop    40 / 40
Right Loop   40 / 40
Whorl        40 / 38
Arch         40 / 40

the left loop is the resultant vector, as discussed in Subsection 2.2.

3.2 The Number of Intersections

This was seen to be nil for the fingerprint arch images. For the fingerprint loop images, this value was at most two, and for fingerprint whorl images it was more than three in all the cases.

4 Results

The fingerprint database that we have used is the publicly available standard FVC–2004 fingerprint data (http://bias.csr.unibo.it/fvc2004/download.asp). This database consists of a total of 320 images (8 copies per person). Some of the FVC–2002 images were also used for classification. We have manually selected the images under the four classes; there were 40 images for each class. The algorithm was implemented in Matlab R2007a, on a Pentium IV processor (3 GHz) with 1 GB of RAM.

The time taken for the extraction of the binary image from the raw fingerprint image was around 3–4 seconds. The output of the extraction stage can be saved, and matching may be continued from this stage once the classification is completed; this saves a certain amount of time. The classification of a single fingerprint image takes a negligible amount of time, since the second stage of the algorithm is geometry based and simple. The classification results are given in Table 1. The overall classification accuracy was observed to be 98.75%; the only class below 100% was whorl.

References

[1] A. V. C. Extracting and enhancing the core area in fingerprint images. International Journal of Computer Science and Network Security, 7(11):16–20, November 2007.
[2] S. C. Dass and A. K. Jain. Fingerprint classification using orientation field flow curves. Indian Conference on Computer Vision, Graphics and Image Processing, December 2004.
[3] A. P. Fitz and R. J. Green. Fingerprint classification using a hexagonal fast Fourier transform. Pattern Recognition, 29(10):1587–1597, October 1996.
[4] J.-H. Hong, J.-K. Min, U.-K. Cho, and S.-B. Cho. Fingerprint classification using one-vs-all support vector machines dynamically ordered with naive Bayes classifiers. Pattern Recognition, 41(2):662–671, February 2008.
[5] A. Jain, L. Hong, and R. Bolle. On-line fingerprint verification. IEEE Transactions on Pattern Analysis and Machine Intelligence, 19(4):302–314, April 1997.
[6] K. Karu and A. K. Jain. Fingerprint classification. Pattern Recognition, 29(3):389–404, March 1996.
[7] J. V. Kulkarni, J. R, S. N. Mali, H. K. Abhyankar, and R. S. Holambe. A new approach for fingerprint classification based on minutiae distribution. IJCS, 1(4):253–259, 2006.
[8] L. Hong, Y. Wan, and A. K. Jain. Fingerprint image enhancement: Algorithm and performance evaluation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(8):777–789, August 1998.
[9] Y. Qi, J. Tian, and R.-W. Dai. Fingerprint classification system with feedback mechanism based on genetic algorithm. International Conference on Pattern Recognition, 1:163–165, August 1998.
[10] N. K. Ratha, K. Karu, S. Chen, and A. K. Jain. A real-time matching system for large fingerprint databases. IEEE Transactions on Pattern Analysis and Machine Intelligence, 18(8):799–813, August 1996.
[11] B. G. Sherlock and D. M. Monro. A model for interpreting fingerprint topology. Pattern Recognition, 26(7):1047–1055, July 1993.
[12] T. Srinivasan, S. Shivashankar, A. V, and B. Rakesh. An adaptively automated five-class fingerprint classification scheme using Kohonen's feature map. International Conference on Intelligent Systems Design and Applications, 1:72–77, October 2006.
[13] Q. Zhang and H. Yan. Fingerprint classification based on extraction and analysis of singularities and pseudo ridges. Pattern Recognition, 37(11):2233–2243, November 2004.