P.Latha
plathamuthuraj@gmail.com
Selection Grade Lecturer,
Department of Electrical and Electronics Engineering,
Government College of Engineering,
Tirunelveli- 627007
Dr.L.Ganesan
Assistant Professor,
Head of Computer Science & Engineering department,
Alagappa Chettiar College of Engineering & Technology,
Karaikudi- 630004
Dr.S.Annadurai
Additional Director, Directorate of Technical Education
Chennai-600025
Abstract
Face recognition is one of the biometric methods used to identify a given face image from the main features of the face. In this paper, a neural-based algorithm is presented to detect frontal views of faces. The dimensionality of the face image is reduced by Principal Component Analysis (PCA) and the recognition is done by a Back Propagation Neural Network (BPNN). Here 200 face images from the Yale database are taken, and performance metrics such as acceptance ratio and execution time are calculated. Neural-based face recognition is robust and achieves an acceptance ratio of more than 90 %.
Key words: Face recognition, Principal Component Analysis, Back Propagation Neural Network, Acceptance ratio, Execution time
1. INTRODUCTION
A face recognition system [6] is a computer vision application that automatically identifies a human face from database images. The face recognition problem is challenging, as it needs to account for all possible appearance variations caused by changes in illumination, facial features, occlusions, etc. This paper gives a neural and PCA based algorithm for efficient and robust face recognition. Holistic approaches, feature-based approaches and hybrid approaches are some of the approaches for face recognition. Here, a holistic approach is used, in which the whole face region is taken into account as input data. This is based on the principal component analysis (PCA) technique, which is used to simplify a dataset into a lower dimension while retaining the characteristics of the dataset.
Pre-processing, principal component analysis and the back propagation neural algorithm are the major implementations of this paper. Pre-processing is done for two purposes:
(i) To reduce noise and possible convolute effects of the interfering system,
(ii) To transform the image into a different space where classification may prove easier by exploitation of certain features.
PCA is a common statistical technique for finding patterns in high dimensional data [1]. Feature extraction, also called dimensionality reduction, is done by PCA for three main purposes:
i) To reduce the dimension of the data to more tractable limits
Φ_i = Γ_i − Ψ ----------- (2)
The covariance matrix is formed by
C = (1/M) Σ_{n=1}^{M} Φ_n Φ_n^T = A · A^T ----------- (3)
where the matrix A = [Φ_1, Φ_2, ....., Φ_M].
This set of large vectors is then subject to principal component analysis, which seeks a set of M orthonormal vectors u_1, ...., u_M. To obtain a weight vector Ω of contributions of individual eigenfaces to a facial image Γ, the face image is transformed into its eigenface components projected onto the face space by a simple operation
ω_k = u_k^T (Γ − Ψ) ----------- (4)
For k = 1, ..., M', where M' ≤ M is the number of eigenfaces used for the recognition. The weights form a vector Ω = [ω_1, ω_2, ......, ω_M'] that describes the contribution of each eigenface in representing the face image Γ, treating the eigenfaces as a basis set for face images. The simplest method for determining which face provides the best description of an unknown input facial image is to find the image k that minimizes the Euclidean distance ε_k.
ε_k = || Ω − Ω_k ||² ------------ (5)
where Ω_k is a weight vector describing the kth face from the training set. A face is classified as belonging to person k when ε_k is below some chosen threshold ε_Θ; otherwise, the face is classified as unknown.
The algorithm functions by projecting face images onto a feature space that spans the significant variations among known face images. The projection operation characterizes an individual face by a weighted sum of eigenface features, so to recognize a particular face, it is necessary only to compare these weights to those of known individuals. The input image is matched to the subject from the training set whose feature vector is the closest within acceptable thresholds.
Eigenfaces have advantages over the other techniques available, such as speed and efficiency. For the system to work well in PCA, the faces must be seen from a frontal view under similar lighting.
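The eigenface pipeline of equations (2)-(5) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the eigen-decomposition shortcut via the smaller M x M matrix, and the default infinite threshold are my assumptions.

```python
import numpy as np

def train_eigenfaces(faces, n_components):
    """faces: (M, D) array, one flattened face image Gamma_i per row."""
    mean = faces.mean(axis=0)                    # the mean face, Psi
    Phi = faces - mean                           # eq (2): Phi_i = Gamma_i - Psi
    # Eigenvectors of C = A A^T (eq (3)) obtained via the smaller M x M matrix
    L = Phi @ Phi.T
    vals, vecs = np.linalg.eigh(L)
    order = np.argsort(vals)[::-1][:n_components]
    U = Phi.T @ vecs[:, order]                   # eigenfaces u_1 .. u_M'
    U /= np.linalg.norm(U, axis=0)               # make the basis orthonormal
    weights = Phi @ U                            # eq (4): Omega for each training face
    return mean, U, weights

def classify(mean, U, train_weights, probe, threshold=np.inf):
    omega = (probe - mean) @ U                              # project probe onto face space
    dists = np.sum((train_weights - omega) ** 2, axis=1)    # eq (5): squared distances
    k = int(np.argmin(dists))
    return k if dists[k] < threshold else None              # below epsilon -> person k, else unknown
```

A probe image that already belongs to the training set projects onto its own weight vector, so it is matched back to itself with near-zero distance; an impossibly low threshold reproduces the "classified as unknown" branch.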
3. NEURAL NETWORKS AND BACK PROPAGATION ALGORITHM
A successful face recognition methodology depends heavily on the particular choice of the features used by the pattern classifier. Back-Propagation is the best known and most widely used learning algorithm for training multilayer perceptrons (MLP) [5]. The MLP refers to the network consisting of a set of sensory units (source nodes) that constitute the input layer, one or more hidden layers of computation nodes, and an output layer of computation nodes. The input signal propagates through the network in a forward direction, from left to right and on a layer-by-layer basis.
Back propagation is a multi-layer feed forward, supervised learning network based on the gradient descent learning rule. This BPNN provides a computationally efficient method for changing the weights in a feed forward network, with differentiable activation function units, to learn a training set
Signal Processing: An International Journal (SPIJ) Volume (3) : Issue (5)
where α is a training rate coefficient that is restricted to the range [0.01, 1.0], h_j is the output of neuron j in the hidden layer, and δ_i can be obtained by
δ_i = (t_i − o_i) o_i (1 − o_i) ----------- (12)
Similarly, the change of the weights between the input layer and hidden layer is given by
Δw_ij = β δ_Hi x_j ----------- (13)
where β is a training rate coefficient that is restricted to the range [0.01, 1.0], x_j is the output of neuron j in the input layer, and δ_Hi can be obtained by
δ_Hi = x_i (1 − x_i) Σ_{j=1}^{k} δ_j w_ij ----------- (14)
x_i is the output at neuron i in the input layer, and the summation term represents the weighted sum of all δ_j values corresponding to the neurons in the output layer, as obtained in equation (12). After calculating the weight change in all layers, the weights can simply be updated by
w_ij(new) = w_ij(old) + Δw_ij ----------- (15)
This process is repeated until the error reaches a minimum value.
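One forward/backward pass implementing equations (12)-(15) for a single training pattern might look like the sketch below. It assumes sigmoid activations throughout and a single hidden layer, as described above; the array names, shapes and default rate coefficients are illustrative, not taken from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bpnn_step(x, t, W_ih, W_ho, alpha=0.5, beta=0.5):
    """One forward and backward pass; x: input vector, t: target vector."""
    h = sigmoid(W_ih @ x)                         # hidden-layer outputs h_j
    o = sigmoid(W_ho @ h)                         # output-layer outputs o_i
    delta_o = (t - o) * o * (1.0 - o)             # eq (12): delta_i = (t_i - o_i) o_i (1 - o_i)
    delta_h = h * (1.0 - h) * (W_ho.T @ delta_o)  # eq (14): deltas for the hidden layer
    W_ho += alpha * np.outer(delta_o, h)          # change for hidden -> output weights
    W_ih += beta * np.outer(delta_h, x)           # eq (13): Delta w_ij = beta * delta_Hi * x_j
    return o                                      # weights updated in place, eq (15)
```

Repeating this step drives the output toward the target, i.e. the squared error shrinks over iterations, which is the descent behaviour the gradient rule is designed to produce.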
2.4.3 Selection of Training Parameters
For the efficient operation of the back propagation network, it is necessary to select the parameters used for training appropriately.
Initial Weights
The initial weights will influence whether the net reaches a global or local minimum of the error and, if so, how rapidly it converges. To get the best result the initial weights are set to random numbers between -1 and 1.
Training a Net
The motivation for applying a back propagation net is to achieve a balance between memorization and generalization; it is not necessarily advantageous to continue training until the error reaches a minimum value. The weight adjustments are based on the training patterns. As long as the error for validation decreases, training continues. Whenever the error begins to increase, the net is starting to memorize the training patterns. At this point training is terminated.
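The stopping rule above (train while validation error falls, terminate as soon as it rises) can be sketched as follows; the `train_epoch` and `val_error` callables are hypothetical placeholders standing in for one training pass and one validation-error measurement.

```python
def train_with_early_stopping(train_epoch, val_error, max_epochs=1000):
    """Train while validation error decreases; stop as soon as it increases."""
    best = float("inf")
    for epoch in range(max_epochs):
        train_epoch()              # one pass of weight adjustments over training patterns
        err = val_error()
        if err > best:             # error rising: net is starting to memorize
            return epoch           # training is terminated at this point
        best = err
    return max_epochs
```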
Number of Hidden Units
If the activation function can vary with the function, then it can be seen that an n-input, m-output function requires at most 2n+1 hidden units. If more hidden layers are present, then the calculation of the δ's is repeated for each additional hidden layer present, summing all the δ's for units present in the previous layer that is fed into the current layer for which δ is being calculated.
Learning rate
In BPN, the weight change is in a direction that is a combination of the current gradient and the previous gradient. A small learning rate is used to avoid major disruption of the direction of learning when a very unusual pair of training patterns is presented.
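The "combination of the current gradient and the previous gradient" described above is the familiar momentum term; a minimal sketch under that reading, where the small learning rate and the momentum coefficient values are illustrative assumptions, not taken from the paper:

```python
def momentum_update(w, grad, prev_delta, lr=0.01, mu=0.9):
    """New weight change = momentum * previous change - learning rate * gradient."""
    delta = mu * prev_delta - lr * grad
    return w + delta, delta
```

Blending in the previous change smooths the trajectory, so a single unusual training pattern cannot swing the direction of learning abruptly.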
Various parameters assumed for this algorithm are as follows.
In this paper, for experimentation, 200 images from the Yale database are taken and a sample of 20 face images is shown in fig 3. One of the images, shown in fig 4a, is taken as the input image. The mean image and the reconstructed output image by PCA are shown in fig 4b and 4c. In BPNN, a training set of 50 images is shown in fig 5a, and the eigenfaces and recognized output image are shown in fig 5b and 5c.
Fig 4. (a) Input Image, (b) Mean Image, (c) Recognized Image by PCA method
Fig 5. (a) Training set, (b) Eigen faces, (c) Recognized Image by BPNN method
Table 1 shows the comparison of acceptance ratio and execution time values for 40, 60, 120, 160 and 200 images of the Yale database. Graphical analysis of the same is shown in fig 6.
No. of Images   Acceptance ratio (%)          Execution Time (Seconds)
                PCA      PCA with BPNN        PCA      PCA with BPNN
40              92.4     96.5                 38       36
60              90.6     94.3                 46       43
120             87.9     92.8                 55       50
160             85.7     90.2                 67       58
200             83.5     87.1                 74       67

Table 1: Comparison of acceptance ratio and execution time for Yale database images
Fig 6: Comparison of acceptance ratio and execution time
5. CONCLUSION
Face recognition has received substantial attention from researchers in the biometrics, pattern recognition and computer vision communities. In this paper, face recognition using eigenfaces has been shown to be accurate and fast. When the BPNN technique is combined with PCA, non-linear face images can be recognized easily. Hence it is concluded that this method has an acceptance ratio of more than 90 % and an execution time of only a few seconds. Face recognition can be applied in security measures at airports, passport verification, criminals' list verification in police departments, visa processing, verification of electoral identification and card security measures at ATMs.
6. REFERENCES
[1]. B.K.Gunturk, A.U.Batur, and Y.Altunbasak, (2003) "Eigenface-domain super-resolution for face recognition," IEEE Transactions on Image Processing, vol. 12, no. 5, pp. 597-606.
[2]. M.A.Turk and A.P.Pentland, (1991) "Eigenfaces for Recognition," Journal of Cognitive Neuroscience, vol. 3, pp. 71-86.
[3]. T.Yahagi and H.Takano, (1994) "Face Recognition using neural networks with multiple combinations of categories," International Journal of Electronics Information and Communication Engineering, vol. J77-D-II, no. 11, pp. 2151-2159.
[4]. S.Lawrence, C.L.Giles, A.C.Tsoi, and A.D.Back, (1997) "Face Recognition: A Convolutional Neural-Network Approach," IEEE Transactions on Neural Networks, vol. 8, no. 1, pp. 98-113.
[5]. C.M.Bishop, (1995) "Neural Networks for Pattern Recognition," London, U.K.: Oxford University Press.
[6]. Kailash J. Karande and Sanjay N. Talbar, "Independent Component Analysis of Edge Information for Face Recognition," International Journal of Image Processing, Volume (3): Issue (3), pp. 120-131.