Research Article
New Brodatz-Based Image Databases for Grayscale, Color, and Multiband Texture Analysis
Copyright © 2013 S. Abdelmounaime and H. Dong-Chen. This is an open access article distributed under the Creative Commons
Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is
properly cited.
Grayscale and color textures can carry informative spectral content. This spectral information coexists with the grayscale or chromatic spatial pattern that characterizes the texture. Such informative but nontextural spectral content can be a source of confusion in rigorous evaluations of the intrinsic performance of texture methods. In this paper, we used basic image processing tools to develop a new class of textures in which texture information is the only source of discrimination. In this new class of textures, spectral information contributes only to forming the texture. The textures are grouped into two databases. The first is the Normalized Brodatz Texture (NBT) database, a collection of grayscale images. The second is the Multiband Texture (MBT) database, a collection of color texture images. This new class of textures is therefore ideal for rigorous comparisons between texture analysis methods based solely on their intrinsic texture characterization performance.
[Figure: gray-level histograms of the Brodatz textures D32, D28, and D10 (a) and D64, D95, and D75 (b), shown before and after gray-level normalization.]
Figure 5: Examples of color texture images from the VisTex database.

…channel and the 𝑖th row of the same texture feature image estimated in a different channel. This figure can be interpreted as correlation profiles along the row dimension of the texture feature images.

As shown in Figure 8, an important correlation exists between the texture information for the three channels of the Fabric.0001 texture. For both texture features (i.e., contrast and dissimilarity), the maximum correlation was recorded between the R and G channels (𝑟 close to 1). For the six correlation profiles, the minimum recorded correlation was 0.881, confirming that, just as for the chromatic content, the texture content of the three Fabric.0001 channels was highly correlated. Similar results were obtained for other VisTex images. This high similarity explains why methods using only one band to estimate texture are successful.

3.3. Colored Brodatz Texture Database. Given the strong chromatic content and the high textural similarity of VisTex images, it was possible to transform grayscale Brodatz images into color images similar to VisTex images. To do so, for each Brodatz image, two additional channels were generated by a simple gray-level shift to form pseudo-RGB color texture images. This produced color Brodatz texture images (Figure 9) with richly textured content (the same as the original Brodatz textures) and highly informative color content similar to VisTex images. This process produced a gradient of colors (e.g., D44 and D95) that gave these images a natural appearance, while …

4. Multiband Texture

This section presents the third category of texture, referred to as multiband texture. Good examples of this category are astronomical and remote sensing images. This section presents an analysis of the chromatic and textural content of some of these images.

4.1. Chromatic Content Analysis. Figure 10 presents three images: the first two are astronomical images from the Spitzer and Hubble NASA telescopes, and the third is from a remote sensing Earth satellite. These images were taken with instruments having relatively high spatial resolutions (e.g., ∼1.8 m for WorldView-2). This produces contrasted bands because it is possible to detect small details in the observed object. Indeed, as shown in Figure 11, the histograms of these images cover the whole 256 gray-level range. This is in contrast with the images in Figure 6. On the other hand, the wavelengths depicted in each of the three-color composites are very different: for example, for the Tarantula Nebula image, emission at 775 nm is depicted in green and at 826 nm in blue. This produces RGB bands that are less correlated than those of images with closer wavelengths, such as natural images taken in the visible spectrum. As shown in Table 3, the correlation coefficients between their RGB channels are relatively small compared with those of the VisTex images given in Table 2 (e.g., 𝑟 = −0.108 and 𝑟 = 0.390 between the red and blue bands of the WorldView-2 and Tarantula Nebula images, resp.). These two characteristics, related to spatial resolution and wavelength, contribute to the formation of images with rich color content.
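The inter-band correlation coefficients of Tables 2 and 3 can be reproduced with a few lines of NumPy. A minimal sketch, assuming each image is available as an H × W × 3 array; the synthetic images at the bottom are illustrative stand-ins for real data:

```python
import numpy as np

def band_correlations(rgb):
    """Pearson correlation coefficients between the flattened
    R, G, and B bands of an H x W x 3 image."""
    flat = rgb.reshape(-1, 3).astype(float)
    c = np.corrcoef(flat, rowvar=False)   # 3 x 3 correlation matrix
    return {"R-G": c[0, 1], "R-B": c[0, 2], "G-B": c[1, 2]}

# Synthetic check: gray-level-shifted bands are perfectly correlated,
# while independent noise bands are not.
rng = np.random.default_rng(0)
g = rng.integers(0, 200, size=(64, 64))
shifted = np.stack([g, g + 20, g + 40], axis=-1)   # r = 1 for all pairs
noise = rng.integers(0, 256, size=(64, 64, 3))     # r close to 0
```

Bands with 𝑟 close to 1 (as for the VisTex images) indicate that the chromatic content carries little independent information, whereas small coefficients (as for the WorldView-2 and Tarantula Nebula images) indicate rich color content.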
ISRN Machine Vision 5
[Figure: RGB gray-level histograms of the VisTex textures Fabric.0001, Wood.0002, and Water.0005.]
Figure 8: Per-row correlation coefficient profile of the cooccurrence matrix features of the Fabric.0001 image.
Figure 9: Texture samples from the Colored Brodatz Texture (CBT) database.
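The gray-level-shift construction behind the CBT images can be sketched as follows. The shift amount, the wrap-around behavior, and the channel assignment here are illustrative assumptions, since the paper does not specify the exact values used:

```python
import numpy as np

def colored_brodatz(gray, shift=32):
    """Build a pseudo-RGB texture from a grayscale Brodatz image by
    deriving two extra channels through a simple gray-level shift.
    NOTE: `shift` and the channel order are assumed for illustration."""
    g = gray.astype(np.int16)
    r = g
    g2 = (g + shift) % 256          # wrap-around shift keeps the full range
    b = (g + 2 * shift) % 256
    return np.stack([r, g2, b], axis=-1).astype(np.uint8)
```

Because all three channels are deterministic functions of the same gray level, the spatial texture pattern is identical in every band, while the pixel-wise channel differences produce color.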
…colored texture category (Table 2). The textural content of the Galaxy IC-342 bands is less correlated than that of the WorldView-2 image. The textural content of some bands in WorldView-2 showed a very small correlation. This was the case, for example, for the dissimilarity of the red and blue channels (𝑟 = 0.087).

For a detailed analysis of the textural content of the images in Figure 13, per-row correlation coefficient (𝑟) profiles were also estimated as described in Section 3. Except for two correlation profiles estimated between the red and the green bands of the WorldView-2 image, the ten other profiles showed small correlations. This means that the bands of these two images possess different textural content, in contrast with the VisTex images (Figure 8).

Given the low informative value of the chromatic content of these images, and the low correlation between the textural content of their spectral bands, we can conclude that, for these images, the most discriminative information is texture. This includes intraband and interband texture information. It would be useful to have a standard database in which images have the same characteristics as those studied in this section. Such a database would be useful for validating methods that focus only on the texture of color images, because the color information has low informative content.
Table 4: Correlation coefficients for the textural content of the three channels of the images in Figure 10.
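The per-row correlation profiles used throughout this analysis (Figures 8 and 13) can be sketched as follows; `feat_a` and `feat_b` are hypothetical names for the same co-occurrence feature image (e.g., contrast) computed on two different bands:

```python
import numpy as np

def per_row_correlation(feat_a, feat_b):
    """Pearson r between row i of a texture-feature image computed on
    one band and row i of the same feature computed on another band."""
    rows = feat_a.shape[0]
    profile = np.empty(rows)
    for i in range(rows):
        profile[i] = np.corrcoef(feat_a[i], feat_b[i])[0, 1]
    return profile
```

A profile that stays near 1 (VisTex) means the bands carry essentially the same texture; values near 0 (WorldView-2) mean the interband texture differs.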
Figure 11: RGB histograms of the three texture images in Figure 10.
Based on our results, multiband texture can be defined by extending the definition proposed by Haralick et al. [7] for grayscale texture: the texture of color (or, in general, multiband) images is formed by the spatial distribution of the tonal variations in the same band plus the spatial distribution of the tonal variations across different bands. The first distribution in the proposed definition defines the grayscale texture, as defined by Haralick in [7]. The second defines the part of texture resulting from interband spatial variation. These two types of distributions define multiband texture. Both spatial distributions contribute, in different amounts, to form the whole texture of the color or multiband image. An important aspect of this definition is that it clearly identifies a certain part of color as an intrinsic part of the texture. Indeed, gray-level variation across the different bands produces color and, when this variation is the result of interband texture differences, it is identified as part of texture. Consequently, this definition introduces a distinction between this chromatic part of texture and the pure chromatic image content that does not possess texture value.

6. Experiments on the MBT Database

Section 5 showed that images from the MBT database have almost the same chromatic content and also have important intraband and interband spatial variations. This has two important implications. The first is that it is not possible to discriminate between MBT images using only chromatic information. The second is that it is not possible to discriminate between MBT images using only intraband texture characterization, as is usually the case for existing color texture databases. In this section, we tested the validity of these two observations in the context of texture classification. To do so, we carried out two independent classifications. The first used only spectral information (RGB values), and the second used only intraband texture information. We used a mosaic of eight textures (Figure 16) from the MBT database. Figures 17(a) and 17(b) indicate the names of the MBT images according to their relative positions in the mosaic and the three textures from the Normalized Brodatz Texture database (Section 2) that were used to generate each MBT image.
Figure 12: Color content comparison between multiband textures (a and b) and VisTex color textures (c and d). The well-localized peaks in
Fabric.0001 and Water.0005 (bottom row) indicate the presence of a dominant color tone, whereas the flat histograms in Galaxy IC 342 and
WorldView-2 (top row) indicate a wide range of color content.
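The hue histograms of Figure 12 can be reproduced with a standard RGB-to-hue conversion. A minimal sketch assuming 8-bit RGB input; the bin count of 36 is an arbitrary choice for illustration:

```python
import numpy as np

def hue_histogram(rgb, bins=36):
    """Histogram of hue angles (in degrees) of an H x W x 3 uint8 image.
    Achromatic pixels (max channel == min channel) are skipped."""
    f = rgb.astype(float) / 255.0
    r, g, b = f[..., 0], f[..., 1], f[..., 2]
    mx = f.max(axis=-1)
    mn = f.min(axis=-1)
    d = np.where(mx > mn, mx - mn, 1.0)   # safe divisor for gray pixels
    chroma = mx > mn
    h = np.zeros_like(mx)
    # standard sector-wise RGB -> hue conversion
    red = chroma & (mx == r)
    grn = chroma & (mx == g) & ~red
    blu = chroma & ~red & ~grn
    h[red] = (60.0 * (g - b) / d)[red] % 360.0
    h[grn] = (60.0 * (b - r) / d + 120.0)[grn]
    h[blu] = (60.0 * (r - g) / d + 240.0)[blu]
    hist, _ = np.histogram(h[chroma], bins=bins, range=(0.0, 360.0))
    return hist
```

A flat histogram (Galaxy IC 342, WorldView-2) indicates no dominant color tone; a sharp, well-localized peak (Fabric.0001, Water.0005) indicates one.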
6.1. Spectral Classification. The mosaic in Figure 16 was classified using several benchmark spectral-based algorithms, including K-means [39], Isodata [39], maximum likelihood [35], and mean-shift [40]. The results for all of the tested algorithms showed that none of them was able to identify the eight textures. For the first three classification algorithms, for example, the results showed that all eight textures were almost evenly distributed over the entire area of the mosaic. This supports the observation in Section 5 related to the quasi-uniform distribution of the color content in MBT images. For the mean-shift segmentation algorithm, the boundaries of the detected regions were totally different from those of the eight textures. This result provides clear evidence that color texture images from the MBT database do not possess purely chromatic informative content. The highest overall classification rate for the spectral classifications was 12.5%.

6.2. Textural Extraction and Classification. Among the existing texture analysis methods, we selected the wavelet transform [35]. This transform is a powerful technique for the analysis and decomposition of images at multiple resolutions and different frequencies [35]. This property makes it especially suitable for the segmentation and classification of texture [9, 41, 42]. As texture feature, we used the local energy measure of each wavelet subband [9], which has proven to be efficient in texture classification [9, 43, 44]:

E = (1/N²) ∑_{(i,j)∈R} W(i,j)²,   (1)

where E is the energy estimated using a neighborhood R of size N, W is the wavelet coefficient, and (i, j) gives the spatial position.

The resulting output for a transformed single band at a fixed level is one approximation subband and three detail subbands (i.e., horizontal, vertical, and diagonal). For texture analysis, only the detail subbands were used [45]. As a result, the energy feature was estimated using the three detail subbands separately. The choice of the wavelet function is a crucial step in texture analysis [46]. It is beyond the scope of this paper to present a detailed study of the effect of wavelet function characteristics on multiband texture discrimination. For the purposes of this study, different wavelet functions were tested. The best combination of wavelet function and moving window for the energy feature calculation was the biorthogonal spline function with a moving window of 33 by 33 pixels. This was fixed empirically based on the overall classification accuracy of the mosaic in Figure 16. We used one-level wavelet decomposition because more decomposition levels did not bring significant improvement.
Figure 13: Per-row correlation coefficient profile of the cooccurrence matrix for the Galaxy IC-342 (a and b) and WorldView-2 (c and d)
images.
Consequently, the classification process used three texture feature images as input in the case of the intensity band and nine in the case of the three-band analysis.

Many classification schemes have been proposed by the pattern recognition community. For our experiments, a simple minimum distance classifier based on the Euclidean distance was used in order to test the discrimination power of the texture features with a basic classifier. Training samples of 50 by 50 pixels each were selected from the center of each of the eight textures in Figure 16 to serve as reference data. The size of 50 by 50 pixels was determined by evaluating the accuracy of the classifier with sample sizes ranging from 10 × 10 to 100 × 100 pixels.

In color images, texture is usually extracted either in the intensity image component [47–50] or in each of the three RGB bands separately [51–53]. Both strategies were tested in two different classifications. The first classification, related to the first strategy, used three texture bands (3 detail subbands). The second used nine texture bands (3 detail bands × 3 spectral bands).

Table 5: Classification results comparing the one-band (intensity) and three-band texture analysis strategies.

                                    Intensity    Three-band
Overall classification rate (%)       22.6         46.8

The classification results are summarized in Table 5. We can notice that the intensity component did not preserve multiband texture: it provided an overall classification rate of only 22.6%. The three-band strategy provided better results, with an overall classification rate of 46.8%. This means that, for MBT images, analyzing each band separately preserves texture better than using the intensity image component. However, neither strategy was able to achieve a satisfactory classification rate on the MBT mosaic. This means that the texture of this mosaic cannot be reduced to three independent texture planes or to one intensity component. The texture of MBT images should be analyzed as a whole, by considering both the intraband and the interband spatial interactions.
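The minimum distance rule used here can be sketched in a few lines; the feature vectors are the per-pixel texture features, and the class means are assumed to come from the 50 × 50 training samples:

```python
import numpy as np

def min_distance_classify(features, class_means):
    """Assign each feature vector (rows of `features`, shape M x F) to
    the class whose mean vector (rows of `class_means`, shape K x F)
    is nearest in Euclidean distance. Returns M class indices."""
    diff = features[:, None, :] - class_means[None, :, :]   # M x K x F
    dist = np.linalg.norm(diff, axis=2)                     # M x K
    return dist.argmin(axis=1)
```

Despite its simplicity, this classifier is enough to compare the discriminative power of the intensity-based and three-band feature sets without the classifier itself dominating the result.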
Figure 14: Texture image samples from the Multiband Texture (MBT) database.
Figure 15: Color and texture content of some images from the multiband texture (MBT) database: hue histograms of the samples D28D92D111 and D15D52D55 (a and b) and per-row correlation profiles of the co-occurrence features of D28D92D111 (c and d).
Figure 17: (a) Names and positions of the eight MBT images in Figure 16. (b) Names of the three textures from the NBT database (Section 2) used to form each MBT texture in (a).
[4] J. R. Smith, C. Y. Lin, and M. Naphade, “Video texture indexing using spatio-temporal wavelets,” in Proceedings of the International Conference on Image Processing (ICIP ’02), vol. 2, pp. II/437–II/440, September 2002.
[5] W. Phillips III, M. Shah, and N. D. Lobo, “Flame recognition in video,” Pattern Recognition Letters, vol. 23, no. 1–3, pp. 319–327, 2002.
[6] R. C. Nelson and R. Polana, “Qualitative recognition of motion using temporal texture,” CVGIP—Image Understanding, vol. 56, no. 1, pp. 78–89, 1992.
[7] R. M. Haralick, K. Shanmugam, and I. Dinstein, “Textural features for image classification,” IEEE Transactions on Systems, Man and Cybernetics, vol. 3, no. 6, pp. 610–621, 1973.
[8] D. C. He and L. Wang, “Texture unit, texture spectrum, and texture analysis,” IEEE Transactions on Geoscience and Remote Sensing, vol. 28, no. 4, pp. 509–512, 1990.
[9] A. Laine and J. Fan, “Texture classification by wavelet packet signatures,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 15, no. 11, pp. 1186–1191, 1993.
[10] A. C. Bovik, M. Clark, and W. S. Geisler, “Multichannel texture analysis using localized spatial filters,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 12, no. 1, pp. 55–73, 1990.
[11] H. Xin, Z. Liangpei, and W. Le, “Evaluation of morphological texture features for mangrove forest mapping and species discrimination using multispectral IKONOS imagery,” IEEE Geoscience and Remote Sensing Letters, vol. 6, no. 3, pp. 393–397, 2009.
[12] A. Voisin, V. A. Krylov, G. Moser, S. B. Serpico, and J. Zerubia, “Classification of very high resolution SAR images of urban areas using copulas and texture in a hierarchical Markov random field model,” IEEE Geoscience and Remote Sensing Letters, vol. 10, no. 1, pp. 96–100, 2013.
[13] R. M. Haralick, “Performance characterization in computer vision,” CVGIP—Image Understanding, vol. 60, no. 2, pp. 245–249, 1994.
[14] P. J. Phillips and K. W. Bowyer, “Empirical evaluation of computer vision algorithms,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 21, no. 4, pp. 289–290, 1999.
[15] P. Brodatz, Textures: A Photographic Album for Artists & Designers, Dover, New York, NY, USA, 1966.
[16] L. Liu and P. Fieguth, “Texture classification from random features,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 34, no. 3, pp. 574–586, 2012.
[17] J. Yang, Y. Zhuang, and F. Wu, “ESVC-based extraction and segmentation of texture features,” Computers & Geosciences, vol. 49, pp. 238–247, 2012.
[18] F. M. Khellah, “Texture classification using dominant neighborhood structure,” IEEE Transactions on Image Processing, vol. 20, no. 11, pp. 3270–3279, 2011.
[19] B. M. Carvalho, T. S. Souza, and E. Garduno, “Texture fuzzy segmentation using adaptive affinity functions,” in Proceedings of the 27th Annual ACM Symposium on Applied Computing, pp. 51–53, Trento, Italy, March 2012.
[20] I. J. Sumana, G. Lu, and D. Zhang, “Comparison of curvelet and wavelet texture features for content based image retrieval,” in Proceedings of the IEEE International Conference on Multimedia and Expo (ICME ’12), pp. 290–295, July 2012.
[21] T. Ojala, T. Maenpaa, M. Pietikainen, J. Viertola, J. Kyllonen, and S. Huovinen, “Outex—new framework for empirical evaluation of texture analysis algorithms,” in Proceedings of the 16th International Conference on Pattern Recognition (ICPR ’02), vol. 1, pp. 701–706, August 2002.
[22] T. Ojala, M. Pietikainen, and T. Maenpaa, “Multiresolution gray-scale and rotation invariant texture classification with local binary patterns,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 7, pp. 971–987, 2002.
[23] P. Janney and Z. Yu, “Invariant features of local textures—a rotation invariant local texture descriptor,” in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR ’07), June 2007.
[24] D. E. Ilea and P. F. Whelan, “Image segmentation based on the integration of colour–texture descriptors—a review,” Pattern Recognition, vol. 44, no. 10-11, pp. 2479–2501, 2011.
[25] Y. Deng and B. S. Manjunath, “Unsupervised segmentation of color-texture regions in images and video,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 23, no. 8, pp. 800–810, 2001.
[26] G. Paschos, “Perceptually uniform color spaces for color texture analysis: an empirical evaluation,” IEEE Transactions on Image Processing, vol. 10, no. 6, pp. 932–937, 2001.
[27] F. Bianconi, A. Fernández, E. González, D. Caride, and A. Calviño, “Rotation-invariant colour texture classification through multilayer CCR,” Pattern Recognition Letters, vol. 30, no. 8, pp. 765–773, 2009.
[28] K. J. Dana, B. van Ginneken, S. K. Nayar, and J. J. Koenderink, “Reflectance and texture of real-world surfaces,” ACM Transactions on Graphics (TOG), vol. 18, no. 1, pp. 1–34, 1999.
[29] L. Wang and D. He, “A new statistical approach for texture analysis,” PE&RS, vol. 56, no. 1, pp. 61–66, 1990.
[30] H. Wechsler and M. Kidode, “A random walk procedure for texture discrimination,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 1, no. 3, pp. 272–280, 1979.
[31] L. M. Kaplan, “Extended fractal analysis for texture classification and segmentation,” IEEE Transactions on Image Processing, vol. 8, no. 11, pp. 1572–1585, 1999.
[32] J. Melendez, M. A. Garcia, D. Puig, and M. Petrou, “Unsupervised texture-based image segmentation through pattern discovery,” Computer Vision and Image Understanding, vol. 115, no. 8, pp. 1121–1133, 2011.
[33] K. I. Kilic and R. H. Abiyev, “Exploiting the synergy between fractal dimension and lacunarity for improved texture recognition,” Signal Processing, vol. 91, no. 10, pp. 2332–2344, 2011.
[34] K. Zuiderveld, “Contrast limited adaptive histogram equalization,” in Graphics Gems IV, pp. 474–485, Morgan Kaufmann, Burlington, Mass, USA, 1994.
[35] R. C. Gonzalez and R. E. Woods, Digital Image Processing, Prentice Hall, Upper Saddle River, NJ, USA, 2008.
[36] A. Drimbarean and P. F. Whelan, “Experiments in colour texture analysis,” Pattern Recognition Letters, vol. 22, no. 10, pp. 1161–1167, 2001.
[37] A. Rosenfeld, C. Y. Wang, and A. Y. Wu, “Multispectral texture,” IEEE Transactions on Systems, Man and Cybernetics, vol. 12, no. 1, pp. 79–84, 1982.
[38] A. Stolpmann and L. S. Dooley, “Genetic algorithms for automized feature selection in a texture classification system,” in Proceedings of the 4th International Conference on Signal Processing (ICSP ’98), vol. 2, pp. 1229–1232, October 1998.
[39] J. T. Tou and R. C. Gonzalez, Pattern Recognition Principles, Addison-Wesley, Reading, Mass, USA, 1974.