ABSTRACT
Image enhancement has been an area of active research for decades. Most studies aim at
improving image quality for better visualization. Contrast Limited Adaptive Histogram
Equalization (CLAHE) is a technique that enhances the visibility of local details in an image
by increasing the contrast of local regions. The algorithm is extensively used by researchers
for applications in medical imagery. The drawback of the CLAHE algorithm is that it is not
automatic: it needs two input parameters, namely N, the size of the sub-window, and CL, the clip
limit. To our knowledge, automatic selection of N and CL, which would make the algorithm
suitable for autonomous systems, has not been addressed. This paper proposes a novel
extension of the conventional CLAHE algorithm in which N and CL are calculated automatically from
the given image data itself, thereby making the algorithm fully adaptive. Our proposed algorithm is
used to study the enhancement of aerial, medical and underwater images. To demonstrate the
effectiveness of our algorithm, a set of quality metric parameters is used. In the conventional
CLAHE algorithm, we vary the values of N and CL and use the quality metric parameters to find
the combination of N and CL that gives the best output. It is observed that, for a given set of input
images, the best results obtained using the conventional CLAHE algorithm exactly match the
results obtained using our algorithm, where N and CL are calculated automatically.
International Journal of Computer Engineering and Technology (IJCET), ISSN 0976-6367(Print),
ISSN 0976 - 6375(Online), Volume 5, Issue 11, November (2014), pp. 32-47 IAEME
I. INTRODUCTION
Histogram processing methods are global operations, in the sense that pixels are modified by
a transformation function based on the gray-level content of the entire image; Histogram
Equalization is an example. A local enhancement algorithm, in contrast, acts on local regions
within an image: the mapping applied to each pixel of the input image is decided by some property
of that pixel's neighborhood. Such methods differ in the property chosen and in the form in which
it appears in the mapping, and the size of the neighborhood, or window size, can be varied. Many
enhancement algorithms require the user to choose one or more input parameters. The enhancement
is said to be adaptive if the algorithm chooses the optimum parameter(s) depending on the
properties of the input image.
Histogram Equalization (HE) [5] is one of the well-known methods for enhancing the contrast of
an image, making the resulting image have a near-uniform distribution of gray levels. It flattens
and stretches the dynamic range of the image's histogram, resulting in an overall contrast
improvement. HE has been widely applied where images need enhancement; however, it may
significantly change the brightness of the input image and cause problems in applications where
brightness preservation is necessary. Moreover, since HE is computed from the global information
of the input image, local details that occur with small probability are not enhanced.
In Adaptive Histogram Equalization (AHE), contrast at smaller scales is enhanced, while contrast
at larger scales is reduced. When the image region containing a pixel's neighbourhood is fairly
homogeneous, its histogram will be strongly peaked, and the transformation function will map a
narrow range of pixel values to the whole range of the result image. This causes AHE to
over-amplify small amounts of noise in largely homogeneous regions of the image [1].
In this section, we describe in detail two newly proposed algorithms: Adaptively Clipped
Contrast Limited Adaptive Histogram Equalization (ACCLAHE) and Fully Automatic Contrast
Limited Adaptive Histogram Equalization (Auto-CLAHE).
2.6 Equalize the above histogram to obtain the clipped HE mapping for the sub-image
3. For each pixel in the input image, do the following
3.1 If the pixel belongs to an internal region (IR), then
(a) Compute four weights, one for each of the four nearest sub-images, based on the
proximity of the pixel to the centers of the four nearest sub-images (nearer the center of
the sub-image, the larger the weight).
(b) Calculate the output mapping for the pixel as the weighted sum of the clipped HE
mappings for the four nearest sub-images using the weights computed above.
3.2 If the pixel belongs to a border region (BR), then
(a) Compute two weights, one for each of the two nearest sub-images, based on the
proximity of the pixel to the centers of the two nearest sub-images
(b) Calculate the output mapping for the pixel as the weighted sum of the clipped HE
mappings for the two nearest sub-images using the weights computed above.
3.3 If the pixel belongs to a corner region (CR), the output mapping for the pixel is the
clipped HE mapping for the sub-image that contains the pixel.
4. Apply the output mapping obtained to each of the pixels in the input image to obtain the image
enhanced by ACCLAHE.
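The per-pixel blending in step 3 is the standard CLAHE bilinear interpolation of per-tile mappings. As a minimal illustrative sketch of the internal-region case 3.1 (written in Python rather than the paper's VC++, with hypothetical helper names and a hypothetical 2x2 layout for the four nearest sub-images), it can be expressed as:

```python
def interpolate_mapping(value, maps, y_frac, x_frac):
    """Blend four per-tile lookup tables for an internal-region pixel.

    maps   -- 2x2 nested list of LUTs [[top-left, top-right],
              [bottom-left, bottom-right]], each a list of 256 output levels
    y_frac -- vertical fraction of the pixel between the upper and lower
              tile centres (0 at the upper centre, 1 at the lower centre)
    x_frac -- horizontal fraction between the left and right tile centres
    """
    # Step 3.1(a): weights grow as the pixel nears a tile centre.
    w_tl = (1 - y_frac) * (1 - x_frac)
    w_tr = (1 - y_frac) * x_frac
    w_bl = y_frac * (1 - x_frac)
    w_br = y_frac * x_frac
    # Step 3.1(b): weighted sum of the four clipped HE mappings.
    return (w_tl * maps[0][0][value] + w_tr * maps[0][1][value]
            + w_bl * maps[1][0][value] + w_br * maps[1][1][value])
```

The border-region case 3.2 is the same idea with two tiles and two weights, and the corner case 3.3 degenerates to a single tile's mapping.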
Algorithm 2: AUTO-CLAHE
Input: Image file;
Output: AUTO-CLAHE Enhanced Image
STEPS:
1. For n=0 to n=12 store entropy[n] = 0.
2. For n = 2 to n = 12, divide the image into an n x n matrix of sub-images and store the maximum
entropy of the n x n sub-images as entropy[n].
3. Set N to that value of n for which entropy[n] is maximum.
4. Divide the input image into an NxN matrix of sub-images
5. For each sub-image do the following:
5.1 Compute the histogram of the sub-image.
5.2 Compute the high peak value of the sub-image.
5.3 Calculate the nominal clipping level P, between 0 and the high peak value, using the
binary search described earlier.
5.4 For each gray level bin in the histogram do the following
(a) If the histogram bin is greater than the nominal clip level P, clip the histogram to the
nominal clip level P
(b) Collect the number of pixels in the sub-image that caused the histogram bin to exceed
the nominal clip level.
5.5 Distribute the clipped pixels uniformly in all histogram bins to obtain the renormalized
clipped histogram.
5.6 Equalize the above histogram to obtain the clipped HE mapping for the sub-image
6. For each pixel in the input image, do the following
6.1 If the pixel belongs to an internal region (IR), then
(a) Compute four weights, one for each of the four nearest sub-images, based on the
proximity of the pixel to the centers of the four nearest sub-images (nearer the center of
the sub-image, larger the weight)
(b) Calculate the output mapping for the pixel as the weighted sum of the clipped HE
mappings for the four nearest sub-images using the weights computed above.
6.2 If the pixel belongs to a border region (BR), then
(a) Compute two weights, one for each of the two nearest sub-images, based on the
proximity of the pixel to the centers of the two nearest sub-images
(b) Calculate the output mapping for the pixel as the weighted sum of the clipped HE
mappings for the two nearest sub-images using the weights computed above.
6.3 If the pixel belongs to a corner region (CR), the output mapping for the pixel is the
clipped HE mapping for the sub-image that contains the pixel.
7. Apply the output mapping obtained to each of the pixels in the input image to obtain the image
enhanced by Auto-CLAHE.
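Steps 1-3 above, the automatic choice of N, can be sketched as follows. This is an illustrative Python fragment, not the paper's VC++ implementation; it assumes the image is a flat row-major list of 8-bit gray values, and for simplicity ignores any remainder pixels at the right and bottom edges when tiling.

```python
import math

def entropy(pixels):
    """Discrete entropy of a list of 8-bit gray values."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    return -sum((h / total) * math.log2(h / total) for h in hist if h)

def choose_n(image, height, width, n_max=12):
    """Steps 1-3: pick the N that maximises the best sub-image entropy."""
    best_n, best_e = 2, -1.0
    for n in range(2, n_max + 1):
        th, tw = height // n, width // n   # sub-image size
        max_e = 0.0                        # best entropy among the n x n tiles
        for ty in range(n):
            for tx in range(n):
                tile = [image[(ty * th + y) * width + tx * tw + x]
                        for y in range(th) for x in range(tw)]
                max_e = max(max_e, entropy(tile))
        if max_e > best_e:
            best_n, best_e = n, max_e
    return best_n
```

Steps 4 onward then proceed exactly as in ACCLAHE with this automatically chosen N.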
All the algorithms presented in this paper are implemented on the Windows 7 Microsoft
Visual Studio platform, using VC++ as the programming language. Aerial, medical and underwater
images are selected for the study. A set of quality metric parameters, namely Entropy [7], Global
Contrast (GC) [8], Spatial Frequency (SF) [9], Fitness Measure (FM) [10] and Absolute Mean
Brightness Error (AMBE) [11], is used to measure the quality of the enhanced image with respect
to the original image. The formulas of the quality parameters are given in the Appendix (section VIII).
In Contrast Limited Adaptive Histogram Equalization (CLAHE), we have two input
parameters, N and CL. The value of N is initially kept constant at N=4 and the value of CL is varied
from 50 to 750 in steps of 50. The experiment is repeated for N=4, 8 and 12. Analysis of
the results shows that, for a given value of N, as we increase the value of CL, beyond a certain
value of CL all quality metric parameters reach saturation and remain constant throughout the
scale, as shown in Tables I, II and III and in the sample graphs for Global Contrast in Fig 13. The
saturation value of the clip limit for a given image is not the same for all values of N. It is seen
that as we increase the value of N, the optimum value of the clip limit that gives the best
enhancement result decreases, as evident from Table IV. The reason may be attributed as follows:
as we increase the value of N, the size of the sub-image decreases. This implies a decrease in the
number of pixels in the sub-image and thus a lowering of the maximum bin height in the local
histogram.
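The argument above, that a larger N means smaller sub-images and hence a lower maximum bin height in the local histograms, can be checked numerically. A small illustrative sketch (Python, hypothetical function name; the image is again a flat row-major list of gray values):

```python
def max_tile_bin(image, height, width, n):
    """Largest histogram bin over all n x n sub-images of a gray image."""
    th, tw = height // n, width // n   # sub-image size (remainder ignored)
    peak = 0
    for ty in range(n):
        for tx in range(n):
            hist = [0] * 256
            for y in range(th):
                for x in range(tw):
                    hist[image[(ty * th + y) * width + tx * tw + x]] += 1
            peak = max(peak, max(hist))   # tallest bin seen so far
    return peak
```

For any fixed image, the returned peak is non-increasing as n grows, which is why the clip limit needed to affect the histogram shrinks with increasing N.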
In the proposed Adaptively Clipped Contrast Limited Adaptive Histogram Equalization
(ACCLAHE) method, N is given as manual input and CL is estimated automatically. The value of N
is varied from 2 to 12 in steps of 2. It is seen from the Table V that the value of all quality
parameters increases initially and subsequently decreases after a certain value of N as shown for
Entropy and Fitness measure in Figs 14-15. The point where the quality parameters reaches
maximum value matches exactly with the saturation value obtained in CLAHE. This fact is observed
for all images used in our experiment.
In the proposed Fully Automatic Contrast Limited Adaptive Histogram Equalization
(Auto-CLAHE) method, the values of N and CL are estimated automatically. The effects of quality
metric parameters on the output image after enhancement are studied. It is seen that the saturation
value of CLAHE and ACCLAHE exactly matches with the results obtained using Auto-CLAHE as
shown in Table VI.
Fig. 2: Results of CLAHE Image 1 for (a) N=4, CL=100 (b) N=4, CL=200 (c) N=4, CL=300
(d) N=4, CL=400 (e) N=4, CL=500
(f) N=8, CL=100 (g) N=8, CL=200 (h) N=8, CL=300 (i) N=8, CL=400 (j) N=8, CL=500
(k) N=12, CL=100 (l) N=12, CL=200 (m) N=12, CL=300 (n) N=12, CL=400 (o) N=12, CL=500
Fig. 3: Results of ACCLAHE Image 1 for (a) N=2, (b) N=4, (c) N=8, (d) N=10, (e) N=12
Fig. 5: Results of CLAHE Image 2 for (a) N=4, CL=100 (b) N=4, CL=200 (c) N=4, CL=300
(d) N=4, CL=400 (e) N=4, CL=500
(f) N=8, CL=100 (g) N=8, CL=200 (h) N=8, CL=300 (i) N=8, CL=400 (j) N=8, CL=500
(k) N=12, CL=100 (l) N=12, CL=200 (m) N=12, CL=300 (n) N=12, CL=400 (o) N=12, CL=500
Fig. 6: Results of ACCLAHE Image 2 for (a) N=2, (b) N=4, (c) N=8, (d) N=10, (e) N=12
Fig. 8: Results of CLAHE Image 3 for (a) N=4, CL=100 (b) N=4, CL=200 (c) N=4, CL=300
(d) N=4, CL=400 (e) N=4, CL=500
(f) N=8, CL=100 (g) N=8, CL=200 (h) N=8, CL=300 (i) N=8, CL=400 (j) N=8, CL=500
(k) N=12, CL=100 (l) N=12, CL=200 (m) N=12, CL=300 (n) N=12, CL=400 (o) N=12, CL=500
Fig. 9: Results of ACCLAHE Image 3 for (a) N=2, (b) N=4, (c) N=8, (d) N=10, (e) N=12
Image 1                    Entropy   GC        SF       FM       AMBE
N=2                        7.735     4960.00   43.308   21.001   25.068
N=4                        7.897     4505.35   44.792   21.799   24.464
N=6                        7.831     4009.16   44.513   21.630   22.799
N=8                        7.749     3417.34   43.865   21.502   21.952
N=10                       7.636     2861.22   41.431   21.210   19.482
N=12                       7.522     2380.94   39.055   20.881   18.203
Original Image             4.898      346.85    9.458   12.554   -
Saturation value of CLAHE  7.897     4505.35   44.792   21.799   24.464
ACCLAHE Image              7.897     4505.35   44.792   21.799   24.464
Auto-CLAHE Image           7.897     4505.35   44.792   21.799   24.464

Image 2                    Entropy   GC        SF       FM       AMBE
N=2                        7.707     5131.02   49.538   21.696   41.857
N=4                        7.887     4843.81   62.315   22.362   40.511
N=6                        7.933     4745.58   62.499   22.500   39.758
N=8                        7.917     4418.54   61.172   22.451   36.916
N=10                       7.879     4104.29   58.612   22.325   34.592
N=12                       7.831     3726.32   55.031   22.167   31.572
Original Image             6.450     1752.12   15.719   17.471   -
Saturation value of CLAHE  7.887     4843.81   62.315   22.362   40.511
ACCLAHE Image              7.887     4843.81   62.315   22.362   40.511
Auto-CLAHE Image           7.887     4843.81   62.315   22.362   40.511

Image 3                    Entropy   GC        SF       FM       AMBE
N=2                        7.951     5047.09   38.682   22.042   45.920
N=4                        7.945     4570.98   34.695   22.110   44.355
N=6                        7.889     4010.07   41.355   22.010   42.161
N=8                        7.824     3538.79   41.424   21.838   37.713
N=10                       7.745     3083.33   40.419   21.611   33.982
N=12                       7.664     2688.36   39.415   21.387   30.112
Original Image             7.230     1465.29   16.694   19.642   -
Saturation value of CLAHE  7.951     5047.09   34.682   22.042   45.920
ACCLAHE Image              7.951     5047.09   34.682   22.042   45.920
Auto-CLAHE Image           7.951     5047.09   34.682   22.042   45.920
V. CONCLUSION
The aim of our research is to make the CLAHE algorithm automatic and adaptive, with no manual
input. The values of N and CL are estimated automatically from the given image data, thereby making
the algorithm applicable in any autonomous system. In the existing CLAHE, it is observed that, for a
given value of N, as we increase the value of CL, the values of all quality metric parameters
eventually remain constant under further increases in CL. We have termed this the Saturation
Value.
In the proposed ACCLAHE and Auto-CLAHE, the set of quality metric parameters obtained for
a given input image exactly matches the saturation values obtained in CLAHE. We have
also analyzed the methodology used to evaluate the algorithms' performance, highlighting the
works where a quantitative quality metric has been used.
VI. ACKNOWLEDGMENTS
The authors express their sincere gratitude to Prof. N R Shetty, Director, Nitte Meenakshi
Institute of Technology and Dr. H C Nagaraj, Principal, Nitte Meenakshi Institute of Technology for
providing encouragement, support and the infrastructure to carry out the research.
VII. REFERENCES
[1] Stephen M. Pizer, E. Philip Amburn, John D. Austin, Robert Cromartie, Adaptive Histogram
Equalization and Its Variations, Computer Vision, Graphics, And Image Processing 39, 355-
368 (1987)
[2] K. Zuiderveld, Contrast Limited Adaptive Histogram Equalization, Academic Press Inc.,
(1994).
[3] Kashif Iqbal, Rosalina Abdul Salam, Azam Osman and Abdullah Zawawi Talib, Underwater
Image Enhancement Using an Integrated Colour Model, IAENG International Journal of
Computer Science, 34:2, IJCS_34_2_12, 2007.
[4] Balvant Singh, Ravi Shankar Mistra, Puran Gour, Analysis of Contrast Enhancement
Techniques For Underwater Image, IJCTEE Volume 1, Issue 2, 2009
[5] Rajesh Garg, Bhawna Mittal, Sheetal Garg, Histogram Equalization Techniques For Image
Enhancement, International Journal of Electronics & Communication Technology, Volume 2,
Issue 1, March 2011.
[6] Ramyashree N, Pathra P, Shruthi T V, Dr. Jharna Majumdar, Enhancement of Aerial and
Medical Image using Multi-resolution Pyramid, Special Issue of IJCCT Vol. 1, Issue 2, 3, 4;
International Conference ACCTA-2010.
[7] Zhengmao Ye, Objective Assessment of Nonlinear Segmentation Approaches to Gray Level
Underwater Images, ICGST-GVIP Journal, ISSN 1687-398X, Volume (9), Issue (II), April
2009
[8] Jia-Guu Leu, Image Contrast Enhancement Based on the Intensities of Edge Pixels,
CVGIP: Graphical Models And Image Processing Vol. 54, No. 6, November, pp. 497-506,
1992.
[9] Sonja Grgic, Mislav Grgic, Marta Mrak, Reliability of Objective Picture Quality Measures,
Journal of Electrical Engineering, Vol. 55, No. 1-2, 2004, pp. 3-10.
[10] Munteanu C and Rosa A, Gray-Scale Image Enhancement as an Automatic Process Driven
by Evolution, IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics,
Vol. 34, No. 2, April 2004.
[11] Iyad Jafar and Hao Ying, A New Method for Image Contrast Enhancement Based on Automatic
Specification of Local Histograms, IJCSNS International Journal of Computer Science and
Network Security, Vol. 7, No. 7, July 2007.
[12] Raimondo Schettini and Silvia Corchs, Underwater Image Processing: State of the Art of
Restoration and Image Enhancement Methods, EURASIP Journal on Advances in Signal
Processing, Volume 2010.
[13] Rajesh Kumar Rai, Puran Gour, Balvant Singh, Underwater Image Segmentation using
CLAHE Enhancement and Thresholding, International Journal of Emerging Technology and
Advanced Engineering, ISSN 2250-2459, Volume 2, Issue 1, January 2012
[14] Wan Nural Jawahir Hj Wan Yussof, Muhammad Suzuri Hitam, Ezmahamrul Afreen
Awalludin, and Zainuddin Bachok, Performing Contrast Limited Adaptive Histogram
Equalization Technique on Combined Color Models for Underwater Image Enhancement,
International Journal of Interactive Digital Media, Vol. 1(1), ISSN 2289-4098, e-ISSN 2289-
4101- 2013
VIII. APPENDIX
A. ENTROPY:
The entropy [7], also called discrete entropy, is a measure of the information content in an image
and is given by

Entropy = -\sum_{k=0}^{255} p(k) \log_2(p(k))                  (1)

where p(k) is the probability distribution function of gray level k. The larger the entropy, the
larger the information contained in the image, and hence the more details are visible in the image.
B. GLOBAL CONTRAST (GC):
The Global Contrast [8] measures the spread of pixel intensities about the mean and is given by

GC = \frac{1}{N} \sum_{i=0}^{L} (i - \mu)^2 \, hist(i)                  (2)

where \mu is the average intensity of the image, hist(i) is the number of pixels in the image with
intensity value i, N is the total number of pixels and L is the highest intensity value.
C. SPATIAL FREQUENCY (SF):
The Spatial Frequency [9] indicates the overall activity level in an image. SF is defined as
follows:

SF = \sqrt{R^2 + C^2}                  (3)

R = \sqrt{\frac{1}{MN} \sum_{j=1}^{M} \sum_{k=2}^{N} (x_{j,k} - x_{j,k-1})^2}                  (4)

C = \sqrt{\frac{1}{MN} \sum_{k=1}^{N} \sum_{j=2}^{M} (x_{j,k} - x_{j-1,k})^2}                  (5)
where R is the row frequency, C is the column frequency and x_{j,k} denotes the pixel intensity
values of the image; M and N are the numbers of rows and columns of the image.
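Equations (3)-(5) translate directly into code. An illustrative Python sketch (not the paper's VC++ implementation), taking the image as a 2-D list of gray values:

```python
import math

def spatial_frequency(img):
    """Spatial frequency of a 2-D list of gray values (Eqs. 3-5)."""
    m, n = len(img), len(img[0])
    # Row frequency: squared differences along each row (Eq. 4, pre-sqrt).
    rf = sum((img[j][k] - img[j][k - 1]) ** 2
             for j in range(m) for k in range(1, n)) / (m * n)
    # Column frequency: squared differences down each column (Eq. 5, pre-sqrt).
    cf = sum((img[j][k] - img[j - 1][k]) ** 2
             for j in range(1, m) for k in range(n)) / (m * n)
    return math.sqrt(rf + cf)   # Eq. 3, with R^2 = rf and C^2 = cf
```

A constant image gives SF = 0, while a checkerboard, the most active pattern, gives the maximum SF for its amplitude.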
D. FITNESS MEASURE:
The Fitness Measure [10] depends on the entropy H(I), the number of edges n(I) and the intensity
of the edges E(I):

FitnessMeasure = H(I) \cdot \frac{n(I)}{width \times height} \cdot \ln(\ln(E(I)) + e)                  (6)

Compared to the original image, the enhanced version should have a higher intensity of the edges.
E. ABSOLUTE MEAN BRIGHTNESS ERROR (AMBE):
The AMBE [11] is the absolute difference between the mean brightness of the input image and that
of the output image:

AMBE = |\mu_i - \mu_o|                  (7)

where \mu_i and \mu_o are the mean intensities of the input and output images. The AMBE value
provides a sense of how the global appearance of the image has changed, lower values being
preferred.
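The AMBE computation is a one-liner; a small Python sketch over flat lists of gray values:

```python
def ambe(original, enhanced):
    """Absolute Mean Brightness Error: |mean(input) - mean(output)|."""
    mu_in = sum(original) / len(original)    # mean brightness of the input
    mu_out = sum(enhanced) / len(enhanced)   # mean brightness of the output
    return abs(mu_in - mu_out)
```

Note that, unlike the other metrics, AMBE compares two images, which is why the "Original Image" rows of the tables carry no AMBE value.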