
A Comparison of Various Illumination Normalization Techniques

Illumination variation causes dramatic changes in facial appearance. The varying direction and energy distribution of the ambient illumination, together with the 3D structure of the human face, can lead to major differences in the shading and shadows on the face [52-59]. Various illumination normalization methods have been used to compensate for these variations. As a preprocessing step, Histogram Equalization (HE) can be applied, but this only performs contrast enhancement. To suppress the illumination variation itself, the normalization methods described in detail below have been used.
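As an illustration of the histogram-equalization preprocessing mentioned above, the following is a minimal sketch in Python, assuming an 8-bit grayscale face image; the function name and the use of NumPy are illustrative choices, not taken from the paper.

```python
import numpy as np

def histogram_equalization(img):
    """Spread the gray levels of an 8-bit grayscale image over the full [0, 255] range."""
    hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
    cdf = hist.cumsum().astype(np.float64)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())   # normalized cumulative histogram
    lut = np.round(255.0 * cdf).astype(np.uint8)        # gray-level mapping table
    return lut[img]                                     # remap every pixel
```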
1 DCT with Rescaling Down Low Frequency Coefficients

The Discrete Cosine Transform (DCT) transforms a signal from its spatial representation into a frequency representation. The low-frequency components of the DCT correspond to illumination variation in a digital image, so rescaling down these low-frequency components can compensate for illumination variation [60]. In this technique, histogram equalization is first applied to the face image for contrast stretching, and the DCT is then applied to the resulting image. The 2D DCT of an M x N image is defined as

C(u,v) = \alpha(u)\,\alpha(v) \sum_{x=0}^{M-1} \sum_{y=0}^{N-1} f(x,y)\, \cos\frac{\pi(2x+1)u}{2M}\, \cos\frac{\pi(2y+1)v}{2N}

The inverse DCT is defined as

f(x,y) = \sum_{u=0}^{M-1} \sum_{v=0}^{N-1} \alpha(u)\,\alpha(v)\, C(u,v)\, \cos\frac{\pi(2x+1)u}{2M}\, \cos\frac{\pi(2y+1)v}{2N}

Since the low-order DCT coefficients correspond to low-frequency components, rescaling them down can suppress illumination variation. The value of the rescaling-down factor and the number of low-frequency components to be rescaled are determined experimentally. The DC coefficient, located at the upper-left corner, holds most of the energy and is proportional to the average intensity of the image, so it is not selected for rescaling down; instead it is enhanced, which improves the visual appearance of the processed image. The other low-frequency coefficients are divided by the Rescaling Down Factor (RDF) to eliminate the effect of illumination variations.
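A minimal sketch of this rescaling step is given below, assuming a grayscale face image that has already been histogram equalized. The size of the triangular low-frequency region (n_low) and the rescaling-down factor (rdf) are illustrative placeholders; the paper determines both experimentally.

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct_rescale_low_freq(img, n_low=20, rdf=50.0):
    """Rescale down the low-frequency DCT coefficients of a face image."""
    coeffs = dctn(img.astype(np.float64), norm='ortho')   # 2D DCT
    for u in range(n_low):                                 # triangular region in the
        for v in range(n_low - u):                         # upper-left (low-frequency) corner
            if u == 0 and v == 0:
                continue                                   # keep the DC coefficient (the paper enhances it)
            coeffs[u, v] /= rdf                            # divide by the Rescaling Down Factor
    return idctn(coeffs, norm='ortho')                     # back to the spatial domain
```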

2 DCT in Logarithm Domain by Suppressing Low Coefficients


In this approach, the discrete cosine transform is performed in the logarithm domain to compensate for illumination variations. An appropriate number of low-frequency DCT coefficients are truncated to eliminate the variations under different lighting conditions. A logarithm transformation is applied first to expand the values of dark pixels for image enhancement. An image is the product of two components, reflectance r(x,y) and illumination e(x,y):

f(x,y) = e(x,y) \cdot r(x,y)
Taking the logarithm of both sides of this equation gives

\log f(x,y) = \log e(x,y) + \log r(x,y)

Now if e(x,y) is the incident illumination and e'(x,y) is the desired illumination, the desired image f'(x,y) under normalized lighting satisfies

\log f'(x,y) = \log r(x,y) + \log e'(x,y) = \log r(x,y) + \log e(x,y) - \varepsilon(x,y) = \log f(x,y) - \varepsilon(x,y)

where \varepsilon(x,y) = \log e(x,y) - \log e'(x,y) is called the compensation term. An illumination-compensated image can therefore be obtained by subtracting the product of the DCT basis images and the low-frequency DCT coefficients; the resulting system works like a High Pass Filter (HPF). If n low-frequency DCT coefficients are set to zero, we have

f'(x,y) = \sum_{u=0}^{M-1} \sum_{v=0}^{N-1} \alpha(u)\,\alpha(v)\, C(u,v)\, \cos\frac{\pi(2x+1)u}{2M}\, \cos\frac{\pi(2y+1)v}{2N} \;-\; \sum_{i=1}^{n} \alpha(u_i)\,\alpha(v_i)\, C(u_i,v_i)\, \cos\frac{\pi(2x+1)u_i}{2M}\, \cos\frac{\pi(2y+1)v_i}{2N}

= f(x,y) - \sum_{i=1}^{n} \alpha(u_i)\,\alpha(v_i)\, C(u_i,v_i)\, \cos\frac{\pi(2x+1)u_i}{2M}\, \cos\frac{\pi(2y+1)v_i}{2N}

The second summation can be regarded as the illumination compensation term, so f'(x,y) above is just the desired normalized face image in the logarithm domain. Discarding low-frequency DCT coefficients in the logarithm domain is therefore equivalent to compensating for illumination variations.
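The following is a minimal sketch of this truncation, assuming a grayscale face image. The number of discarded coefficients (n_low) is a placeholder, and keeping the DC coefficient so that the overall gray level is preserved is an illustrative choice rather than something prescribed by the text.

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct_log_truncate(img, n_low=15):
    """Discard low-frequency DCT coefficients in the logarithm domain."""
    log_img = np.log(img.astype(np.float64) + 1.0)      # logarithm transform
    coeffs = dctn(log_img, norm='ortho')
    dc = coeffs[0, 0]                                    # overall brightness
    for u in range(n_low):                               # zero a triangular block of
        for v in range(n_low - u):                       # low-frequency coefficients
            coeffs[u, v] = 0.0
    coeffs[0, 0] = dc                                    # keep the mean gray level
    return np.exp(idctn(coeffs, norm='ortho')) - 1.0     # back to the intensity domain
```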

3 Wavelet Analysis
The wavelet transform enhances the contrast as well as the edges of the face image simultaneously in the frequency domain [65]. A wavelet transform represents a signal in terms of a set of basis functions obtained by dilation and translation of a basis wavelet. In the two-band multiresolution wavelet transform, a signal can be expressed by wavelet and scaling basis functions at different scales:

f(x) = \sum_{k} a_{0,k}\, \phi_{0,k}(x) + \sum_{j} \sum_{k} d_{j,k}\, \psi_{j,k}(x)

where \phi_{j,k} are scaling functions at scale j, \psi_{j,k} are wavelet functions at scale j, and a_{j,k} and d_{j,k} are the scaling and wavelet coefficients. After the wavelet transform, an image is divided into four sub-bands: LL, generated by the approximation coefficients, and LH, HL and HH, generated by the detail coefficients. First, contrast enhancement is performed by applying histogram equalization to the approximation coefficients. Second, edge enhancement is performed by multiplying the detail coefficients by a scalar greater than 1. A normalized image is then obtained by taking the inverse wavelet transform of the modified approximation and detail coefficients. A simple block diagram of this process is given in Fig. 10.

[Block diagram: Wavelet Transform; Approximation Coefficient Modification; Detail Coefficient Modification; Reconstruction]

Fig.10. Block diagram representation of wavelet analysis for illumination normalization
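A minimal sketch of the process in Fig. 10 is given below, using the PyWavelets package. The choice of wavelet ('haar'), the detail gain of 1.5, and the rank-based histogram equalization of the floating-point approximation coefficients are all illustrative assumptions, not values taken from the text.

```python
import numpy as np
import pywt

def wavelet_normalize(img, wavelet='haar', detail_gain=1.5):
    """Equalize the approximation sub-band and amplify the detail sub-bands."""
    cA, (cH, cV, cD) = pywt.dwt2(img.astype(np.float64), wavelet)

    # Contrast enhancement: histogram-equalize the approximation coefficients
    # by mapping them to their ranks, then back to the original value range.
    flat = cA.ravel()
    ranks = np.argsort(np.argsort(flat)).astype(np.float64)
    cA_eq = (ranks / (flat.size - 1)) * (flat.max() - flat.min()) + flat.min()
    cA_eq = cA_eq.reshape(cA.shape)

    # Edge enhancement: multiply the detail coefficients by a scalar > 1.
    cH, cV, cD = detail_gain * cH, detail_gain * cV, detail_gain * cD

    # Reconstruction: inverse wavelet transform of the modified coefficients.
    return pywt.idwt2((cA_eq, (cH, cV, cD)), wavelet)
```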

4 Homomorphic Filtering
Homomorphic filtering basically acts as a high pass filter used for illumination normalization. It is applied in the logarithm domain, in which an image is composed of two components, illumination and reflectance, as discussed before. After homomorphic filtering, the illumination component of the processed image is drastically reduced because of the high-pass filtering effect, while the reflectance component remains very close to the original. The image after homomorphic filtering is therefore a normalized face image. The homomorphic filtering algorithm is performed after taking the logarithm of the face image. The homomorphic filter used here is a Butterworth high-pass filter whose transfer function is given by

H(u,v) = \frac{1}{1 + \left[ D_0 / D(u,v) \right]^{2n}}

where n is the order of the filter and D_0 is the cutoff distance from the centre of the frequency plane. D(u,v) is defined as

D(u,v) = \left[ (u - M/2)^2 + (v - N/2)^2 \right]^{1/2}

where M and N are the numbers of rows and columns of the face image. The whole process of homomorphic filtering is described in Fig. 3.

[Block diagram: I(x,y) -> log -> H(u,v) applied in the frequency domain -> inverse transform -> exp -> I'(x,y)]

Fig.3. Process of Homomorphic Filtering
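A minimal sketch of the pipeline in Fig. 3 is given below, using a centred FFT and the Butterworth high-pass transfer function defined above. The cutoff d0 and the filter order n are placeholder values chosen for illustration.

```python
import numpy as np

def homomorphic_filter(img, d0=30.0, n=2):
    """log -> FFT -> Butterworth high-pass -> inverse FFT -> exp."""
    M, N = img.shape
    log_img = np.log(img.astype(np.float64) + 1.0)          # logarithm of the face image
    spectrum = np.fft.fftshift(np.fft.fft2(log_img))         # centred frequency spectrum

    u = np.arange(M).reshape(-1, 1) - M / 2.0
    v = np.arange(N).reshape(1, -1) - N / 2.0
    D = np.sqrt(u ** 2 + v ** 2)                             # distance from the centre
    H = 1.0 / (1.0 + (d0 / (D + 1e-8)) ** (2 * n))           # Butterworth high-pass H(u,v)

    filtered = np.fft.ifft2(np.fft.ifftshift(H * spectrum)).real
    return np.exp(filtered) - 1.0                            # back to the intensity domain
```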

5 Gamma Intensity Correction


Gamma correction has become increasingly important in the past few years, as the use of digital images for commercial purposes has increased. Gamma correction produces an output that is close in appearance to the original image [54]. The gamma correction method adjusts the overall brightness of an image to that of a predefined canonical face image. It is formulated as follows: predefine a canonical image I_0 that has been captured under some normal lighting condition. Given a face image I captured under some unknown lighting condition, its canonical version is computed by a Gamma transform applied pixel by pixel over the image positions (x, y):

I'_{xy} = G(I_{xy}; \gamma^{*})

where the Gamma coefficient \gamma^{*} is computed by the following optimization, which minimizes the difference between the transformed image and the predefined canonical face image I_0:

\gamma^{*} = \arg\min_{\gamma} \sum_{x,y} \left[ G(I_{xy}; \gamma) - I^{0}_{xy} \right]^{2}

Here I_{xy} is the gray level at image position (x, y), and

G(I_{xy}; \gamma) = c \cdot I_{xy}^{1/\gamma}

is the Gamma transform, where c is a gray-stretch parameter and \gamma is the Gamma coefficient. Gamma correction thus has the effect of adjusting the overall brightness of all face images to the same level as that of the common canonical face I_0.
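The following is a minimal sketch of this correction, assuming 8-bit input images and a simple grid search for the optimal gamma; the search range, the stretch constant c = 1, and the function name are illustrative assumptions rather than values given in the text.

```python
import numpy as np

def gamma_intensity_correction(I, I0, c=1.0, gammas=np.linspace(0.2, 5.0, 100)):
    """Find the gamma that brings I closest to the canonical face I0 and apply it."""
    I = I.astype(np.float64) / 255.0                 # gray levels in [0, 1]
    I0 = I0.astype(np.float64) / 255.0
    best_gamma, best_err = 1.0, np.inf
    for g in gammas:
        transformed = c * I ** (1.0 / g)             # Gamma transform G(I; gamma) = c * I^(1/gamma)
        err = np.sum((transformed - I0) ** 2)        # squared difference to the canonical face
        if err < best_err:
            best_gamma, best_err = g, err
    return c * I ** (1.0 / best_gamma)               # corrected image, values in [0, 1]
```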

Comparison
A comparison of various illumination normalization techniques is given in Table 1. The comparison was performed on the Yale B face database, which contains 10 subjects and 5760 single-light-source images captured under 576 different viewing conditions. The comparison shows that DCT with rescaling down the low-frequency coefficients yields a better recognition rate than the other techniques. This method acts like a high pass filter while also preserving the features of the face image, so it can be used for illumination normalization to obtain better performance from a face recognition system.

Technique Applied                            Subset 3   Subset 4   Subset 5
No Enhancement                               3.333      47.1429    84.7368
Histogram Equalization                       0          31.42      67.3684
Wavelet Analysis                             0          38.5714    43.157
Gamma Intensity Correction                   0          27.12      36.482
Logarithmic Transformation                   1.6667     25         39.4737
Homomorphic Filtering                        0          26.42      37.368
DCT discarding low frequency components      0          25.743     17.222
DCT rescaling low frequency components       0          25.743     11.185

Table 1. Comparison of various illumination normalization techniques on the Yale B database (error rates in %).

