Lecture 6, Lecture 7
Pattern Recognition
Professor Sayed Fadel Bahgat

Chapter Two
Homogeneity (HOM), also called "Inverse Difference Moment"

Homogeneity equation:

HOM = \sum_{i,j=0}^{N-1} \frac{P_{i,j}}{1 + (i-j)^2}
Exercise:
Calculate the homogeneity value for the horizontal GLCM and compare it with the dissimilarity value.

Homogeneity calculation: multiply the homogeneity weights, (1 + (i-j)^2)^{-1}, element by element with the horizontal GLCM, then sum the products.

Homogeneity weights, (1 + (i-j)^2)^{-1}:
1.0   0.5   0.2   0.1
0.5   1.0   0.5   0.2
0.2   0.5   1.0   0.5
0.1   0.2   0.5   1.0
Horizontal GLCM, P_{i,j}:
0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.250  0.042
0      0      0.042  0.083
Multiplying the weights element by element with the horizontal GLCM:

Weights (1 + (i-j)^2)^{-1}          Horizontal GLCM P_{i,j}
1.0  0.5  0.2  0.1              0.166  0.083  0.042  0
0.5  1.0  0.5  0.2      ×       0.083  0.166  0      0
0.2  0.5  1.0  0.5              0.042  0      0.250  0.042
0.1  0.2  0.5  1.0              0      0      0.042  0.083
The element-wise products (rounded):
0.166  0.042  0.008  0
0.042  0.166  0      0
0.008  0      0.250  0.021
0      0      0.021  0.083
Summing all the products:
Homogeneity = 0.166 + 0.042 + 0.008 + 0
            + 0.042 + 0.166 + 0 + 0
            + 0.008 + 0 + 0.250 + 0.021
            + 0 + 0 + 0.021 + 0.083
            = 0.807
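The hand calculation above can be checked with a short script. This is a minimal sketch; the variable and function names are mine, not from the lecture:

```python
# Horizontal GLCM from the worked example above.
glcm = [
    [0.166, 0.083, 0.042, 0.0],
    [0.083, 0.166, 0.0,   0.0],
    [0.042, 0.0,   0.250, 0.042],
    [0.0,   0.0,   0.042, 0.083],
]

def homogeneity(p):
    """HOM = sum over all cells of P[i][j] / (1 + (i - j)^2)."""
    n = len(p)
    return sum(p[i][j] / (1 + (i - j) ** 2)
               for i in range(n) for j in range(n))

print(round(homogeneity(glcm), 3))  # 0.807, matching the hand calculation
```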
Exercise:
Calculate the similarity value for the horizontal GLCM.

Similarity calculation: multiply the similarity weights, (1 + |i-j|)^{-1}, element by element with the horizontal GLCM, then sum the products.

Similarity weights, (1 + |i-j|)^{-1}:
1.0    0.5    0.333  0.25
0.5    1.0    0.5    0.333
0.333  0.5    1.0    0.5
0.25   0.333  0.5    1.0
Horizontal GLCM, P_{i,j}:
0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.250  0.042
0      0      0.042  0.083
Multiplying the weights element by element with the horizontal GLCM gives (rounded):
0.166  0.042  0.014  0
0.042  0.166  0      0
0.014  0      0.250  0.021
0      0      0.021  0.083

Summing all the products gives Similarity ≈ 0.818.
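The similarity sum can be reproduced with the same pattern as the homogeneity script. A minimal sketch (names are mine, not from the lecture):

```python
# Horizontal GLCM from the worked example above.
glcm = [
    [0.166, 0.083, 0.042, 0.0],
    [0.083, 0.166, 0.0,   0.0],
    [0.042, 0.0,   0.250, 0.042],
    [0.0,   0.0,   0.042, 0.083],
]

def similarity(p):
    """Similarity = sum over all cells of P[i][j] / (1 + |i - j|)."""
    n = len(p)
    return sum(p[i][j] / (1 + abs(i - j))
               for i in range(n) for j in range(n))

print(round(similarity(glcm), 3))  # ≈ 0.818 at full precision
```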
ASM equation:

ASM = \sum_{i,j=0}^{N-1} P_{i,j}^2

The square root of the ASM is sometimes used as a texture measure, and is called Energy.
Horizontal GLCM, P_{i,j}:
0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.250  0.042
0      0      0.042  0.083
Squaring each element of the GLCM:
0.028  0.007  0.002   0
0.007  0.028  0       0
0.002  0      0.0625  0.002
0      0      0.002   0.007

Summed: ASM = 0.145, so Energy = √0.145 ≈ 0.381.
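The ASM and Energy values can be checked with a short script. A minimal sketch (names are mine, not from the lecture):

```python
import math

# Horizontal GLCM from the worked example above.
glcm = [
    [0.166, 0.083, 0.042, 0.0],
    [0.083, 0.166, 0.0,   0.0],
    [0.042, 0.0,   0.250, 0.042],
    [0.0,   0.0,   0.042, 0.083],
]

def asm(p):
    """Angular Second Moment: sum of the squared GLCM entries."""
    return sum(v * v for row in p for v in row)

def energy(p):
    """Energy is the square root of the ASM."""
    return math.sqrt(asm(p))

print(round(asm(glcm), 3))     # 0.145
print(round(energy(glcm), 3))  # 0.381
```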
Entropy equation:

Entropy = -\sum_{i,j=0}^{N-1} P_{i,j} \ln(P_{i,j})

(cells where P_{i,j} = 0 are skipped, since ln(0) is undefined)

Entropy calculation: multiply each entry of the horizontal GLCM by ln(P_{i,j}) and by (-1), then sum the products.
P_{i,j}:
0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.250  0.042
0      0      0.042  0.083
ln(P_{i,j}) (zero cells omitted):
-1.7957  -2.4889  -3.1700   –
-2.4889  -1.7957   –        –
-3.1700   –       -1.3863  -3.1700
 –        –       -3.1700  -2.4889
-P_{i,j} ln(P_{i,j}):
0.2980  0.2065  0.1331  0
0.2065  0.2980  0       0
0.1331  0       0.3465  0.1331
0       0       0.1331  0.2065

Summing all the products: Entropy ≈ 2.095.
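The entropy sum can be reproduced in code. A minimal sketch (names are mine, not from the lecture); zero cells are skipped since ln(0) is undefined:

```python
import math

# Horizontal GLCM from the worked example above.
glcm = [
    [0.166, 0.083, 0.042, 0.0],
    [0.083, 0.166, 0.0,   0.0],
    [0.042, 0.0,   0.250, 0.042],
    [0.0,   0.0,   0.042, 0.083],
]

def entropy(p):
    """Entropy = -sum(P * ln(P)), skipping zero entries."""
    return -sum(v * math.log(v) for row in p for v in row if v > 0)

print(round(entropy(glcm), 3))  # 2.095 at full precision
```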
GLCM Mean

GLCM Mean equations:

\mu_i = \sum_{i,j=0}^{N-1} i \cdot P_{i,j} \qquad \mu_j = \sum_{i,j=0}^{N-1} j \cdot P_{i,j}

Exercise:
For the test image, calculate the mean of the symmetrical horizontal and vertical GLCMs.
Horizontal GLCM, P_{i,j}:
0.166  0.083  0.042  0
0.083  0.166  0      0
0.042  0      0.250  0.042
0      0      0.042  0.083

\mu_i = 0(0.291) + 1(0.249) + 2(0.334) + 3(0.125) = 1.292

Because this GLCM is symmetrical, \mu_j = \mu_i = 1.292.
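The GLCM mean can be computed with the same loop pattern. A minimal sketch (names are mine, not from the lecture):

```python
# Horizontal GLCM from the worked example above.
glcm = [
    [0.166, 0.083, 0.042, 0.0],
    [0.083, 0.166, 0.0,   0.0],
    [0.042, 0.0,   0.250, 0.042],
    [0.0,   0.0,   0.042, 0.083],
]

def glcm_mean_i(p):
    """mu_i = sum over all cells of i * P[i][j]."""
    n = len(p)
    return sum(i * p[i][j] for i in range(n) for j in range(n))

print(round(glcm_mean_i(glcm), 3))  # 1.292
```

For a symmetrical GLCM the column mean mu_j equals mu_i, so one loop suffices here.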
Pattern Recognition
0.083
0.167
.083
0.083
0.083
0.083
0.083
0.083
Lecture 6, lecture 7
Pattern Recognition
Lecture 6, lecture 7
Pattern Recognition
Lecture 6, lecture 7
Pattern Recognition
Pattern Recognition
Lecture 6, lecture 7
Pattern Recognition
Lecture 6, lecture 7
Pattern Recognition
Properties of Variance
Variance is a measure of the dispersion of the values around the mean. It is similar to entropy. It answers the question: "What is the dispersion of the difference between the reference and the neighbor pixels in this window?"

Variance equation:

\sigma_i^2 = \sum_{i,j=0}^{N-1} P_{i,j} (i - \mu_i)^2
\mu_i = 1.292

Variance (horizontal) =
  0.166(0-1.292)² + 0.083(0-1.292)² + 0.042(0-1.292)² + 0
+ 0.083(1-1.292)² + 0.166(1-1.292)² + 0 + 0
+ 0.042(2-1.292)² + 0 + 0.250(2-1.292)² + 0.042(2-1.292)²
+ 0 + 0 + 0.042(3-1.292)² + 0.083(3-1.292)²
= 1.039067
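The horizontal variance can be checked in code, computing the mean first and reusing it. A minimal sketch (names are mine, not from the lecture):

```python
# Horizontal GLCM from the worked example above.
glcm = [
    [0.166, 0.083, 0.042, 0.0],
    [0.083, 0.166, 0.0,   0.0],
    [0.042, 0.0,   0.250, 0.042],
    [0.0,   0.0,   0.042, 0.083],
]

def glcm_variance_i(p):
    """Variance about the row mean: sum of P[i][j] * (i - mu_i)^2."""
    n = len(p)
    mu = sum(i * p[i][j] for i in range(n) for j in range(n))
    return sum(p[i][j] * (i - mu) ** 2
               for i in range(n) for j in range(n))

print(round(glcm_variance_i(glcm), 3))  # 1.039
```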
\mu_j = 1.162

Variance (vertical) = \sum_{i,j} P_{i,j} (j - \mu_j)^2, computed over the vertical GLCM:
0.250  0      0.083  0
0      0.167  0.083  0
0.083  0.083  0.083  0.083
0      0      0.083  0
GLCM Correlation
The Correlation texture measures the linear dependency of grey levels on those of neighboring pixels.

GLCM Correlation equation:

Correlation = \sum_{i,j=0}^{N-1} P_{i,j} \, \frac{(i - \mu_i)(j - \mu_j)}{\sigma_i \, \sigma_j}
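Correlation combines the mean and variance computed earlier. A minimal sketch for the horizontal GLCM (names are mine, not from the lecture):

```python
import math

# Horizontal GLCM from the worked examples above.
glcm = [
    [0.166, 0.083, 0.042, 0.0],
    [0.083, 0.166, 0.0,   0.0],
    [0.042, 0.0,   0.250, 0.042],
    [0.0,   0.0,   0.042, 0.083],
]

def glcm_correlation(p):
    """Correlation = sum(P * (i - mu_i)(j - mu_j)) / (sigma_i * sigma_j)."""
    n = len(p)
    cells = [(i, j) for i in range(n) for j in range(n)]
    mu_i = sum(i * p[i][j] for i, j in cells)
    mu_j = sum(j * p[i][j] for i, j in cells)
    var_i = sum(p[i][j] * (i - mu_i) ** 2 for i, j in cells)
    var_j = sum(p[i][j] * (j - mu_j) ** 2 for i, j in cells)
    num = sum(p[i][j] * (i - mu_i) * (j - mu_j) for i, j in cells)
    return num / math.sqrt(var_i * var_j)

print(round(glcm_correlation(glcm), 3))  # 0.718
```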