
Digital Image Processing

Image Enhancement
(Histogram Processing)

Course Website: http://www.comp.dit.ie/bmacnamee


Come To The LABS!
Day: Wednesday
Time: 9:00 – 11:00
Room: Aungier St. 1-005
We will start by getting to grips with the
basics of Scilab
– Lab details available at WebCT
Shortly, there will be a Scilab assignment
which will count towards your final mark
Contents
Over the next few lectures we will look at
image enhancement techniques working in
the spatial domain:
– What is image enhancement?
– Different kinds of image enhancement
– Histogram processing
– Point processing
– Neighbourhood operations
A Note About Grey Levels
So far when we have spoken about image
grey level values we have said they are in
the range [0, 255]
– Where 0 is black and 255 is white
There is no reason why we have to use this
range
– The range [0, 255] stems from display technologies
For many of the image processing
operations in this lecture grey levels are
assumed to be given in the range [0.0, 1.0]
What Is Image Enhancement?
Image enhancement is the process of
making images more useful
The reasons for doing this include:
– Highlighting interesting detail in images
– Removing noise from images
– Making images more visually appealing
Images taken from Gonzalez & Woods, Digital Image Processing (2002)
Image Enhancement Examples
Image Enhancement Examples (cont…)
Image Enhancement Examples (cont…)
Image Enhancement Examples (cont…)
Spatial & Frequency Domains
There are two broad categories of image
enhancement techniques
– Spatial domain techniques
• Direct manipulation of image pixels
– Frequency domain techniques
• Manipulation of Fourier transform or wavelet
transform of an image
For the moment we will concentrate on
techniques that operate in the spatial
domain
Image Histograms
The histogram of an image shows us the
distribution of grey levels in the image
Massively useful in image processing,
especially in segmentation
(Figure: an image histogram - frequency of each grey level.)

Histogram Examples
A selection of images and their histograms
Notice the relationships
between the images and
their histograms
Note that the high contrast
image has the most
evenly spaced histogram
Contrast Stretching
We can fix images that have poor contrast
by applying a fairly simple contrast
stretching transformation
The interesting part is deciding on this
transformation function
Histogram Equalisation
Spreading out the frequencies in an image (or
equalising the image) is a simple way to
improve dark or washed out images
The formula for histogram equalisation is:

sk = T(rk) = ∑j=1..k pr(rj) = ∑j=1..k nj / n

– rk: input intensity
– sk: processed intensity
– k: the intensity range (e.g. 0.0 – 1.0)
– nj: the frequency of intensity j
– n: the sum of all frequencies
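A minimal MATLAB sketch of this formula, assuming an 8-bit greyscale image already loaded into the matrix img (the file name and variable names are illustrative, not from the slides):

% Histogram equalisation sketch for an 8-bit greyscale image
img = double(imread('input.png'));    % hypothetical input file
n   = numel(img);                     % total number of pixels
h   = zeros(1, 256);
for k = 0:255
    h(k+1) = sum(img(:) == k);        % frequency nk of grey level k
end
p  = h / n;                           % pr(rk), the normalised histogram
T  = cumsum(p);                       % sk = sum of pr(rj) for j <= k, in [0, 1]
eq = T(img + 1);                      % map every pixel through T
imshow(uint8(round(255 * eq)));       % rescale to [0, 255] for display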

Equalisation Transformation Function

Equalisation Examples
Equalisation Transformation Functions

The functions used to equalise the images in the previous example

Equalisation Examples
Equalisation Transformation Functions

The functions used to equalise the images in the previous example
Equalisation Examples (cont…)

Equalisation Examples (cont…)
Equalisation Transformation Functions

The functions used to equalise the images in the previous examples
Summary
We have looked at:
– Different kinds of image enhancement
– Histograms
– Histogram equalisation
Next time we will start to look at point
processing and some neighbourhood
operations
HISTOGRAM
TRANSFORMATION IN IMAGE
PROCESSING AND ITS
APPLICATIONS

Attila Kuba
University of Szeged
Contents
 Histogram
 Histogram transformation
 Histogram equalization
 Contrast stretching
 Applications
Histogram
The (intensity or brightness) histogram shows how many
times a particular grey level (intensity) appears in an image.

For example, 0 - black, 255 – white

As an example, the 4×4 image

0 1 1 2
2 1 0 0
5 2 0 0
1 1 2 4

has the histogram (grey level : count):
0:5   1:5   2:4   3:0   4:1   5:1   6:0
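A minimal MATLAB sketch of this counting for the small example above (the matrix is just that example; any grey-level image works the same way):

% Count how many times each grey level 0..6 occurs in a small image
A = [0 1 1 2;
     2 1 0 0;
     5 2 0 0;
     1 1 2 4];
h = zeros(1, 7);               % one bin per grey level 0..6
for g = 0:6
    h(g+1) = sum(A(:) == g);   % number of pixels with grey level g
end
disp(h)                        % -> 5 5 4 0 1 1 0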
Histogram II
An image has low contrast when the complete range of possible
values is not used. Inspection of the histogram shows this
lack of contrast.
Histogram of color images
RGB color can be converted to a gray scale
value by

Y = 0.299R + 0.587G + 0.114B

Y: the grayscale component in the YIQ color


space used in NTSC television.
The weights reflect the eye's brightness
sensitivity to the color primaries.
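A one-line MATLAB sketch of this weighted conversion (the file name is illustrative; rgb2gray in the Image Processing Toolbox applies the same weights):

% Weighted RGB-to-greyscale conversion
rgb = im2double(imread('photo.png'));   % hypothetical colour image, values in [0, 1]
Y   = 0.299*rgb(:,:,1) + 0.587*rgb(:,:,2) + 0.114*rgb(:,:,3);
imshow(Y);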
Histogram of color images II
Histogram:
individual histograms of red, green and blue

Histogram of color images III

(Figure: separate histograms of the red, green and blue channels.)
Histogram of color images IV
or
a 3-D histogram can be produced, with
the three axes representing the red,
blue and green channels, and
brightness at each point representing
the pixel count
Histogram transformation
Point operation: sk = T(rk)

rk, sk: grey values

Properties of T:
 keeps the original range of grey values
 monotonically increasing
Histogram equalization (HE)

transforms the intensity values


so that the histogram of the
output image approximately
matches the flat (uniform) histogram
Histogram equalization II.
For the discrete case the following formula applies:

sk = (L−1) · ∑j=0..k nj / n,   k = 0, 1, 2, ..., L−1

L: number of grey levels in the image (e.g., 256)
nj: number of times the j-th grey level appears in the image
n: total number of pixels in the image
Histogram equalization III
Histogram equalization IV
Histogram equalization V

cumulative histogram
Histogram equalization VI
Histogram equalization VII

HE
Histogram equalization VIII
the histogram can also be taken over a part of the
image
Histogram projection (HP)
assigns equal display space to
every occupied raw signal level,
regardless of how many pixels are
at that same level. In effect, the raw
signal histogram is "projected" into a
similar-looking display histogram.
Histogram projection II

IR image

HE HP
Histogram projection III
occupied (used) grey level: there is at least
one pixel with that grey level

B(k): the fraction of occupied grey levels at or below grey level k
B(k) rises from 0 to 1 in discrete uniform steps
of 1/n, where n is the total number of occupied
levels

HP transformation:

sk = 255 ·B(k).
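A minimal MATLAB sketch of HP along these lines, assuming a raw integer-valued image in the variable ir_image (the names are illustrative):

% Histogram projection: equal display space for every occupied grey level
raw    = double(ir_image);            % assumed raw image with integer grey levels
levels = unique(raw(:));              % occupied grey levels, sorted
n      = numel(levels);               % total number of occupied levels
B      = zeros(size(raw));
for i = 1:n
    B(raw == levels(i)) = i / n;      % B(k): fraction of occupied levels at or below k
end
display_img = uint8(round(255 * B));  % sk = 255 * B(k)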
Plateau equalization
By clipping the histogram count at a
saturation or plateau value, one can
produce display allocations
intermediate in character between
those of HP and HE.
Plateau equalization II

HE PE 50
Plateau equalization III
The PE algorithm computes the distribution not for the full image
histogram but for the histogram clipped at a plateau (or saturation) value
in the count.
When that plateau value is set at 1, we generate B(k) and so perform
HP;
When it is set above the histogram peak, we generate F(k) and so
perform HE.
At intermediate values, we generate an intermediate distribution which
we denote by P(k).

PE transformation:

sk = 255· P(k)
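A minimal sketch of PE under these definitions (the plateau value 50 echoes the "PE 50" example above; the image variable is an assumption):

% Plateau equalisation: clip the histogram at a plateau value, then equalise
raw     = double(ir_image);            % assumed raw image with integer grey levels >= 0
plateau = 50;                          % plateau (saturation) value
gmax    = max(raw(:));
h       = zeros(1, gmax + 1);
for g = 0:gmax
    h(g+1) = sum(raw(:) == g);         % raw histogram counts
end
hc  = min(h, plateau);                 % clip every count at the plateau
P   = cumsum(hc) / sum(hc);            % clipped cumulative distribution P(k)
out = uint8(round(255 * P(raw + 1)));  % sk = 255 * P(k)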
Histogram specification (HS)
an image's histogram is transformed
according to a desired function
Transforming the intensity values so that the
histogram of the output image
approximately matches a specified
histogram.
Histogram specification II
(Diagram: histogram1 is equalised by T and histogram2 by S;
the specification mapping is S⁻¹ ∘ T.)
Contrast stretching (CS)
By stretching the histogram we attempt to use
the available full grey level range.

The appropriate CS transformation :


sk = 255·(rk-min)/(max-min)
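A direct MATLAB sketch of this stretch (img assumed to be a greyscale image; names illustrative):

% Linear contrast stretch to the full [0, 255] range
g    = double(img);
gmin = min(g(:));
gmax = max(g(:));                          % assumes gmax > gmin
s    = 255 * (g - gmin) / (gmax - gmin);   % sk = 255*(rk - min)/(max - min)
imshow(uint8(round(s)));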
Contrast stretching II
Contrast stretching III

CS does not help here

HE
Contrast stretching IV

CS

HE
Contrast stretching V

CS
1% - 99%
Contrast stretching VI

HE

CS
79, 136

CS
Cutoff fraction: 0.8
Contrast stretching VIII

a more general CS:

sk = 0                                  if rk < plow
sk = 255·(rk − plow)/(phigh − plow)     if plow ≤ rk ≤ phigh
sk = 255                                if rk > phigh
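A sketch of this more general stretch with percentile cutoffs, in the spirit of the "1% – 99%" example above (the cutoff fractions are illustrative):

% Contrast stretch between two percentiles, clipping the tails
g     = double(img);
v     = sort(g(:));
plow  = v(max(1, round(0.01 * numel(v))));   % approximate 1st percentile
phigh = v(round(0.99 * numel(v)));           % approximate 99th percentile
s     = 255 * (g - plow) / (phigh - plow);
s     = min(max(s, 0), 255);                 % 0 below plow, 255 above phigh
imshow(uint8(round(s)));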
Contrast stretching IX
Contrast stretching X
Contrast stretching XI
Applications
CT lung studies
Thresholding
Normalization
Normalization of MRI images
Presentation of high dynamic images (IR, CT)
CT lung studies

Yinpeng Jin HE taken in a part of the image


CT lung studies

R.Rienmuller
Thresholding
converting a greyscale image to a binary one

for example, when the histogram is bi-modal

threshold: 120
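A minimal sketch of global thresholding (the threshold value is the one from the example above):

% Global thresholding: grey levels above the threshold become white
g      = double(img);
thresh = 120;
binary = g > thresh;        % logical image: 1 where g > 120, 0 elsewhere
imshow(binary);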
Thresholding II
when the histogram is not bi-modal

threshold: 80 threshold: 120


Normalization I
When one wishes to compare two or more
images on a specific basis, such as texture,
it is common to first normalize their
histograms to a "standard" histogram. This
can be especially useful when the images
have been acquired under different
circumstances. Such a normalization is, for
example, HE.
Normalization II
Histogram matching takes into account the
shape of the histogram of the original image
and the one being matched.
Normalization of MRI images
MRI intensities do not have a fixed meaning,
not even within the same protocol for the
same body region obtained on the same
scanner for the same patient.
Normalization of MRI images II

L. G. Nyúl, J. K. Udupa
Normalization of MRI images III
A: Histograms of 10 FSE PD brain volume images of MS patients.
B: The same histograms after scaling.
C: The histograms after final standardization.
(L. G. Nyúl, J. K. Udupa)
Normalization of MRI images IV
(Figure: unimodal and bimodal histograms with landmarks m1, p1, µ, p2, m2.)

Method: transform image histograms by landmark matching
 Determine the location of landmark µi (examples: mode, median, various percentiles (quartiles, deciles)).
 Map the intensity of interest to the standard scale for each volume image linearly and determine the location µ' of µi on the standard scale.
Normalization of MRI images V
Applications III
A digitized high dynamic range image, such as an
infrared (IR) image or a CAT scan image, spans a
much larger range of levels than the typical values
(0 - 255) available for monitor display. The
function of a good display algorithm is to map
these digitized raw signal levels into display
values from 0 to 255 (black to white), preserving
as much information as possible for the purposes
of the human observer.
Applications IV
The HP algorithm is widely used by infrared
(IR) camera manufacturers as a real-time
automated image display.
The PE algorithm is used in the B-52 IR
navigation and targeting sensor.
Image Enhancement:
Histogram Based Methods
What is the histogram of a digital image?

• The histogram of a digital image with gray values r0, r1, …, rL−1 is the discrete function

p(rk) = nk / n

nk: number of pixels with gray value rk
n: total number of pixels in the image

• The function p(rk) represents the fraction of the total number of pixels with
gray value rk.
• Histogram provides a global description of the appearance of
the image.

• If we consider the gray values in the image as realizations of a


random variable R, with some probability density, histogram
provides an approximation to this probability density. In other
words,

Pr( R = rk ) ≈ p (rk )
Some Typical Histograms
• The shape of a histogram provides useful information for
contrast enhancement.

Dark image
Bright image

Low contrast image


High contrast image
Histogram Equalization
• What is the histogram equalization?

• The histogram equalization is an approach to enhance a given image. The approach is to design a transformation T(·) such that the gray values in the output image are uniformly distributed in [0, 1].

• Let us assume for the moment that the input image to be


enhanced has continuous gray values, with r = 0 representing
black and r = 1 representing white.

• We need to design a gray value transformation s = T(r), based


on the histogram of the input image, which will enhance the
image.
Assume that:
(1) T(r) is a monotonically increasing function for 0≤ r≤ 1
(preserves order from black to white).
(2) T(r) maps [0,1] into [0,1] (preserves the range of allowed
gray values), i.e. 0 ≤ T(r) ≤ 1 for 0 ≤ r ≤ 1.
• The inverse transformation is denoted by r = T⁻¹(s). We
assume that the inverse transformation also satisfies the above
two conditions.

• Consider the gray values in the input image and output


image as random variables in the interval [0, 1].

• Let pr(r) and p s (s) denote the probability density of the


Gray values in the input and output images.
• If pr(r) and T(r) are known, and r = T⁻¹(s) satisfies condition 1, we can write
(a result from probability theory):

ps(s) = [ pr(r) · dr/ds ]   evaluated at r = T⁻¹(s)
• One way to enhance the image is to design a transformation
T(.) such that the gray values in the output is uniformly
distributed in [0, 1], i.e. ps (s) = 1, 0 ≤ s ≤ 1

• In terms of histograms, the output image will have all


gray values in “equal proportion” .

• This technique is called histogram equalization.


s = T(r) = ∫₀ʳ pr(w) dw,   0 ≤ r ≤ 1

This is the cumulative distribution function (CDF) of pr (r) and satisfies the
previous two conditions.

• From the previous equation and using the fundamental


theorem of calculus,
ds/dr = pr(r)

ps(s) = [ pr(r) · 1/pr(r) ] at r = T⁻¹(s) = [1] at r = T⁻¹(s) = 1,   0 ≤ s ≤ 1

• The output probability density function is uniform, regardless of the


input.

• Thus, using a transformation function equal to


the CDF of input gray values r, we can obtain
an image with uniform gray values.
• This usually results in an enhanced image, with an increase in the
dynamic range of pixel values.
How to implement histogram equalization?

Step 1:For images with discrete gray values, compute:

pin(rk) = nk / n,   0 ≤ rk ≤ 1,   0 ≤ k ≤ L−1
L: Total number of gray levels

nk: Number of pixels with gray value rk

n: Total number of pixels in the image

Step 2: Based on CDF, compute the discrete version of the previous


transformation :

sk = T(rk) = ∑j=0..k pin(rj),   0 ≤ k ≤ L−1
Example:

• Consider an 8-level 64 x 64 image with gray values (0, 1, …,


7). The normalized gray values are (0, 1/7, 2/7, …, 1). The
normalized histogram is given below:

NB: The gray values in output are also (0, 1/7, 2/7, …, 1).
(Table: gray value rk, normalized gray value, number of pixels nk and fraction of pixels pin(rk) for each of the 8 levels.)

• Applying the transformation sk = T(rk) = ∑j=0..k pin(rj), we have:
• Notice that there are only five distinct gray levels --- (1/7, 3/7,
5/7, 6/7, 1) in the output image. We will relabel them as (s0,
s1, …, s4 ).

• With this transformation, the output image will have the histogram shown below.

(Figure: histogram of the output image - # pixels vs. gray values.)

• Note that the histogram of output image is only approximately, and not exactly,
uniform. This should not be surprising, since there is no result that claims
uniformity in the discrete case.
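The computation can be checked with a few lines of MATLAB. The counts nk below are the ones used in the corresponding Gonzalez & Woods textbook example and are assumed here, since the table itself is not reproduced above; they give exactly the five output levels quoted in the text.

% 8-level, 64x64 example (counts assumed from the Gonzalez & Woods example)
nk = [790 1023 850 656 329 245 122 81];   % pixels per grey level, total 64*64 = 4096
p  = nk / sum(nk);                        % pin(rk)
s  = cumsum(p);                           % sk = sum of pin(rj) for j <= k
sq = round(7 * s) / 7;                    % quantise back to the nearest multiple of 1/7
disp(sq)      % -> 1/7, 3/7, 5/7, 6/7, 6/7, 1, 1, 1  (five distinct levels)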
Example Original image and its histogram
Histogram equalized image and its histogram
• Comments:
Histogram equalization may not always produce desirable
results, particularly if the given histogram is very narrow. It
can produce false edges and regions. It can also increase
image “graininess” and “patchiness.”
Histogram Specification
(Histogram Matching)
• Histogram equalization yields an image whose pixels are (in
theory) uniformly distributed among all gray levels.

• Sometimes, this may not be desirable. Instead, we may want a


transformation that yields an output image with a pre-specified
histogram. This technique is called histogram specification.
• Given Information

(1) Input image from which we can compute its histogram .

(2) Desired histogram.

• Goal

Derive a point operation, H(r), that maps the input image into an output
image that has the user-specified histogram.

• Again, we will assume, for the moment, continuous-gray values.


Approach of derivation:

z = H(r) = G⁻¹(v = s = T(r))

(Diagram: the input image is mapped to a uniform image by s = T(r); the desired output image is mapped to the same uniform image by v = G(z); hence z = G⁻¹(T(r)).)
• Suppose the input image has probability density pin(r). We want to find a transformation z = H(r) such that the probability density of the new image obtained by this transformation is pout(z), which is not necessarily uniform.

• First apply the transformation


s = T(r) = ∫₀ʳ pin(w) dw,   0 ≤ r ≤ 1      (*)
This gives an image with a uniform probability density.

• If the desired output image were available, then the following


transformation would generate an image with uniform density:

v = G(z) = ∫₀ᶻ pout(w) dw,   0 ≤ z ≤ 1      (**)
• From the gray values ν we can obtain the gray values z by
using the inverse transformation, z = G-1 (v)

• If instead of using the gray values ν obtained from (**), we


use the gray values s obtained from (*) above (both are
uniformly distributed ! ), then the point transformation

z = H(r) = G⁻¹[v = s = T(r)]

will generate an image with the specified density pout(z),
from an input image with density pin(r)!
• For discrete gray levels, we have

sk = T(rk) = ∑j=0..k pin(rj),   0 ≤ k ≤ L−1

vk = G(zk) = ∑j=0..k pout(zj) = sk,   0 ≤ k ≤ L−1

• If the transformation zk → G(zk) is one-to-one, the inverse


transformation sk → G-1 (sk) , can be easily determined, since
we are dealing with a small set of discrete gray values.

• In practice, this is not usually the case (i.e., zk → G(zk) is not one-to-one)
and we assign gray values to match the given histogram as closely as
possible.
Algorithm for histogram specification:

(1) Equalize the input image to get an image with uniform gray values, using the discrete equation:

    sk = T(rk) = ∑j=0..k pin(rj),   0 ≤ k ≤ L−1

(2) Based on the desired histogram, compute the transformation G using the discrete equation:

    vk = G(zk) = ∑j=0..k pout(zj) = sk,   0 ≤ k ≤ L−1

(3) z = G⁻¹(v = s),  i.e.  z = G⁻¹[T(r)]
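A minimal MATLAB sketch of these three steps for L discrete grey levels, assuming the normalised histograms p_in and p_out are already available (names illustrative):

% Histogram specification for L discrete grey levels
% p_in : normalised histogram of the input image   (1 x L vector)
% p_out: desired (specified) normalised histogram  (1 x L vector)
L = numel(p_in);
T = cumsum(p_in);                      % step 1: sk = T(rk)
G = cumsum(p_out);                     % step 2: vk = G(zk)
z = zeros(1, L);
for k = 1:L
    [~, idx] = min(abs(G - T(k)));     % step 3: nearest zk with G(zk) close to sk
    z(k) = idx - 1;                    % output grey level (0-based) for input level k-1
end
% z now holds the mapping H: input level k-1 -> output level z(k)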
Example:

• Consider the 8-level 64 × 64 image from the previous example.

(Figure: histogram of the original image - # pixels vs. gray value.)
• It is desired to transform this image into a new image, using a transformation
z = H(r) = G⁻¹[T(r)], with the histogram specified below:

(Figure: specified histogram - # pixels vs. gray values.)
• The transformation T(r) was obtained earlier (reproduced
below):

• Now we compute the transformation G as before.


• Compute z = G⁻¹(s). Notice that G is not invertible.

G-1 (0) = ?

G-1 (1/7) = 3/7

G-1 (2/7) = 4/7

G-1 (3/7) = ?

G-1 (4/7) = ?

G-1 (5/7) = 5/7

G-1 (6/7) = 6/7

G-1 (1) = 1
• Combining the two transformations T and G⁻¹, compute z = H(r) = G⁻¹[v = s = T(r)]
• Applying the transformation H to the original image yields an image with
histogram as below:

• Again, the actual histogram of the output image matches the specified histogram only approximately, not exactly. This is because we are dealing with discrete histograms.
Original image and its histogram

Histogram specified image and its histogram


Desired histogram
CIS 350 – 3

Image ENHANCEMENT
in the
SPATIAL DOMAIN
Part 2 Dr. Rolf Lakaemper
Most of these slides are based on the textbook

Digital Image Processing

by Gonzalez/Woods, Chapter 3
Histograms

So far (part 1) :

• Histogram definition
• Histogram equalization

Now:

• Histogram statistics
Histograms

Remember:
The histogram shows the number of
pixels having a certain gray-value
(Figure: histogram - number of pixels vs. gray value (0..1).)
Histograms

The NORMALIZED histogram is the


histogram divided by the total number
of pixels in the source image.

The sum of all values in the normalized


histogram is 1.

The value given by the normalized


histogram for a certain gray value can
be read as the probability of randomly
picking a pixel having that gray value
Histograms

What can the (normalized)


histogram tell about the
image ?
Histograms

1.The MEAN VALUE (or average gray


level)

M = ∑g g·h(g)

Example, for the histogram h(1..6) = 0.3, 0.1, 0.2, 0.1, 0.2, 0.1:

1·0.3 + 2·0.1 + 3·0.2 + 4·0.1 + 5·0.2 + 6·0.1 = 3.1
Histograms

The MEAN value is the average gray


value of the image, the ‘overall
brightness appearance’.
Histograms

2. The VARIANCE

V = ∑g (g − M)²·h(g)   (with M = mean)

and, closely related, the STANDARD DEVIATION

D = sqrt(V)
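A short MATLAB sketch computing these statistics from a normalised histogram, using the small example from the mean-value slide above:

% Mean, variance and standard deviation from a normalised histogram
g = 1:6;                          % grey values
h = [0.3 0.1 0.2 0.1 0.2 0.1];    % normalised histogram, sums to 1
M = sum(g .* h);                  % mean grey value
V = sum(((g - M).^2) .* h);       % variance around the mean
D = sqrt(V);                      % standard deviation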
Histograms

VARIANCE gives a measure about the


distribution of the histogram values
around the mean.

(Figure: two histograms with different spreads around the mean; V1 > V2.)
Histograms

The STANDARD DEVIATION is a value


on the gray level axis, showing the
average distance of all pixels to the
mean

(Figure: two histograms; D1 > D2.)
Histograms

VARIANCE and STANDARD DEVIATION


of the histogram tell us about the
average contrast of the image !

The higher the VARIANCE (=the higher


the STANDARD DEVIATION), the
higher the image’s contrast !
Histograms

Example:

Image and blurred version


Histograms

Histograms with MEAN and


STANDARD DEVIATION

M=0.73 D=0.32 M=0.71 D=0.27


Histograms

Exercise:

Design an autofocus system for a digital


camera !

The system should analyse an area in the middle of the picture and
automatically adjust the lens such that this area is sharp.
Spatial Filtering

End of histograms.

And now to something completely


different …
Spatial Filtering

Spatial Filtering
Spatial Filtering

Spatial Filtering:

Operation on the set of ‘neighborhoods’


N(x,y) of each pixel

Example (operator: sum): in the image

6   8   2   0
12 200  20  10

the 2×2 neighbourhood

6   8
12 200

is replaced by its sum, 226.
Spatial Filtering

Neighborhood of a pixel p at position x,y is a


set N(p) of pixels defined relative to p.

Example 1:
N(p) = {(x,y): |x − xP| ≤ 1, |y − yP| ≤ 1}
Spatial Filtering

More examples of neighborhoods:

(Figure: examples of different neighbourhood shapes around a pixel P.)
Spatial Filtering

Usually neighborhoods are used which are


close to discs, since properties of the
Euclidean metric are often useful.

The most prominent neighborhoods are the


4-Neighborhood and the 8-Neighborhood

(Figure: the 4-neighbourhood and the 8-neighbourhood of P.)
Spatial Filtering

We will define spatial filters on the
8-neighborhood and its larger relatives:

N8, N24, N48
Spatial Filtering

Index system for N8:

n1 n2 n3

n4 n5 n6

n7 n8 n9
Spatial Filtering

Motivation: what happens to P if we apply the


following formula:

P = ∑ i ni

n1 n2 n3

n4 n5=P n6

n7 n8 n9
Spatial Filtering

What happens to P if we apply this formula:

P = ∑ i ai ni

with ai given by: a1=1 a2=1 a3=1

a4=1 a5=4 a6=1

a7=1 a8=1 a9=1


Spatial Filtering

Lets have a look at different values of ai and


their effects !
This MATLAB program creates an interesting output:

% Description: given an image 'im',
% create 12 filtered versions using
% randomly designed filters
for i=1:12
    a = rand(7,7);       % create a random 7x7 filter-matrix
    a = a*2 - 1;         % range: -1 to 1
    a = a/sum(a(:));     % normalize
    im1 = conv2(im,a);   % filter
    subplot(4,3,i);
    imshow(im1/max(im1(:)));
end
Spatial Filtering
Spatial Filtering

Different effects of the previous slides


included:

• Blurring / Smoothing
• Sharpening
• Edge Detection

All these effects can be achieved using


different coefficients.
Spatial Filtering

Blurring / Smoothing
(Sometimes also referred to as averaging or lowpass-filtering)

Average the values of the center pixel and its neighbors:

Purpose:

• Reduction of ‘irrelevant’ details


• Noise reduction
• Reduction of ‘false contours’ (e.g.
produced by zooming)
Spatial Filtering

Blurring / Smoothing

1 1 1

1 1 1 * 1/9

1 1 1

Apply this scheme


to every single pixel !
Spatial Filtering

Example 2:
Weighted average

1 2 1

2 4 2 * 1/16

1 2 1
Spatial Filtering

Basic idea:
Weigh the center point the highest,
decrease weight by distance to center.

The general formula for weighted


average:

P = ( ∑i ai·ni ) / ( ∑i ai )

The denominator is a constant value, depending on the mask, not on the image!


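In MATLAB such a weighted average can be applied with conv2, in the same style as the random-filter demo above (the mask is the weighted-average example from the previous slide):

% 3x3 weighted-average (smoothing) filter applied to every pixel
a   = [1 2 1; 2 4 2; 1 2 1];
a   = a / sum(a(:));                  % divide by the sum of the coefficients (16)
im1 = conv2(double(im), a, 'same');   % same size as the input image
imshow(im1 / max(im1(:)));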
Spatial Filtering

Blurring using different


radii (=size of
neighborhood)
Spatial Filtering

EDGE DETECTION
Spatial Filtering

EDGE DETECTION

Purpose:

• Preprocessing
• Sharpening
Spatial Filtering

Motivation: Derivatives
Spatial Filtering

First and second order derivative


Spatial Filtering

First and second order derivative

1. 1st order generally produces thicker edges
2. 2nd order shows a stronger response to detail
3. 1st order generally responds more strongly to a gray level step
4. 2nd order produces a double (pos/neg) response at a step change
Spatial Filtering

Definition of 1 dimensional discrete 1st order derivative:

dF/dX = f(x+1) – f(x)

The 2nd order derivative is the derivative of the 1st order derivative…
Spatial Filtering

The 2nd order derivative is the derivative of the 1st order derivative…

F(x-1) F(x) F(x+1)

derive

F(x)-F(x-1) F(x+1)-F(x) F(x+2)-F(x+1)

F(x+1)-F(x) – (F(x)-F(x-1))

F(x+1)-F(x) – (F(x)-F(x-1))= F(x+1)+F(x-1)-2F(x)


Spatial Filtering

One dimensional 2nd derivative in x direction:

F(x+1)-F(x) – (F(x)-F(x-1))= F(x+1)+F(x-1)-2F(x)

1 -2 1

One dimensional 2nd derivative, direction y:


1

-2

1
Spatial Filtering

TWO dimensional 2nd derivative:

0  0  0     0  1  0     0  1  0
1 -2  1  +  0 -2  0  =  1 -4  1
0  0  0     0  1  0     0  1  0

This mask is called the ‘LAPLACIAN’


(remember calculus ?)
Spatial Filtering

Different variants of the Laplacian


Spatial Filtering

Effect of the Laplacian… (MATLAB demo)

im = im/max(im(:));

% create laplacian
L=[1 1 1;1 -8 1; 1 1 1];

% Filter !
im1=conv2(im,L);

% normalize and show


im1=(im1-min(im1(:))) / (max(im1(:))-min(im1(:)));
imshow(im1);
Spatial Filtering
Spatial Filtering

Edges detected by the Laplacian can be used to sharpen


the image !

Spatial Filtering
Spatial Filtering

Sharpening can be done in 1 pass:

 0 -1  0     0  0  0      0 -1  0
-1  4 -1  +  0  1  0  =  -1  5 -1
 0 -1  0     0  0  0      0 -1  0

LAPLACIAN  +  Original Image  =  Sharpened Image


Spatial Filtering

Sharpening in General:

Unsharp Masking
and

High Boost Filtering


Spatial Filtering

Basic Idea (unsharp masking):

Subtract a BLURRED version of an


image from the image itself !

Fsharp = F − Fblurred


Spatial Filtering

Variation: emphasize original image


(high-boost filtering):

Fsharp = a·F − Fblurred,   a >= 1

 0 -1  0      0  0  0     0  1  0
-1 a-1 -1  =  0  a  0  −  1  1  1
 0 -1  0      0  0  0     0  1  0
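A sketch of high-boost filtering written directly as a·F − Fblurred (here the blurring mask is normalised so that the result stays in a sensible range, which differs slightly from the unnormalised mask on the slide; the value of a is illustrative):

% High-boost filtering: Fsharp = a*F - Fblurred
a        = 1.5;                                % boost factor, a >= 1
blurMask = [0 1 0; 1 1 1; 0 1 0] / 5;          % simple normalised blurring mask
Fblur    = conv2(double(im), blurMask, 'same');
Fsharp   = a * double(im) - Fblur;
imshow(Fsharp, []);                            % rescale for display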
Spatial Filtering

Different Examples of High-Boost Filters:

a = 1:
 0 -1  0
-1  0 -1
 0 -1  0

a = 6:
 0 -1  0
-1  5 -1
 0 -1  0

a = 1e7+1:
 0  -1   0
-1  1e7 -1
 0  -1   0

Laplacian + Image!
Spatial Filtering
Spatial Filtering

Enhancement using the First Derivative:


The Gradient

Definition: 2-dim. column vector

∇f = [Gx ; Gy],   Gx and Gy are 1st order derivatives

MAGNITUDE of gradient:

Mag(∇f) = SQRT(Gx² + Gy²)

Spatial Filtering

For computational reasons the


magnitude is often approximated by:

Mag ~ abs(Gx) + abs(Gy)


Spatial Filtering

Mag ~ abs(Gx) + abs(Gy)

Gx:             Gy:
-1  0  1       -1 -2 -1
-2  0  2        0  0  0
-1  0  1        1  2  1

“Sobel Operators”
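A sketch of gradient-magnitude edge detection with these two masks, using the |Gx| + |Gy| approximation:

% Sobel gradient magnitude
Gx  = [-1 0 1; -2 0 2; -1 0 1];
Gy  = [-1 -2 -1; 0 0 0; 1 2 1];
gx  = conv2(double(im), Gx, 'same');
gy  = conv2(double(im), Gy, 'same');
mag = abs(gx) + abs(gy);            % approximation to sqrt(gx.^2 + gy.^2)
imshow(mag / max(mag(:)));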
Spatial Filtering

Sobel Operators are a pair of operators !


Effect of Sobel Filtering:
Spatial Filtering

Remember the result some slides ago:

First and second order derivative

1. 1st order generally produces thicker edges
2. 2nd order shows a stronger response to detail
3. 1st order generally responds more strongly to a gray level step
4. 2nd order produces a double (pos/neg) response at a step change
Spatial Filtering

In practice, multiple
filters are combined to
enhance images
Spatial Filtering

…continued
Spatial Filtering
Spatial Filtering

Non – Linear Filtering:

Order-Statistics Filters

Median
Min
Max
Spatial Filtering

Median:
The median M of a set of values is such
that half the values in the set are less
than (or equal to) M, and half are
greater (or equal to) M.

1 2 3 3 4 5 6 6 6 7 8 9 9   (median: 6)
Spatial Filtering

Most important properties of the


median:

• less sensitive to noise than the mean

• an element of the original set of
values
• needs sorting (O(n log n))
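A simple sketch of a 3×3 median filter (medfilt2 in the Image Processing Toolbox does this directly; the explicit loop below just shows the idea and skips the one-pixel border):

% 3x3 median filtering
g   = double(im);
out = g;
[rows, cols] = size(g);
for x = 2:rows-1
    for y = 2:cols-1
        block    = g(x-1:x+1, y-1:y+1);   % 3x3 neighbourhood around (x, y)
        out(x,y) = median(block(:));      % replace the centre by the median
    end
end
imshow(out / max(out(:)));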
Spatial Filtering

Median vs. Mean


Spatial Filtering

Min and Max

Example neighbourhood:

2 5 7 3
3 4 2 3
3 4 8 3
3 3 3 3

MIN = 2,  MAX = 8
Spatial Filtering

Min and Max:

Basics of Morphological Filtering

(blackboard and MATLAB examples)


Image Enhancement

• Image enhancement techniques:


 Spatial domain methods
 Frequency domain methods

• Spatial (time) domain techniques are techniques that


operate directly on pixels.

• Frequency domain techniques are based on modifying


the Fourier transform of an image.

Fourier Transform

• The ‘Fourier Transform’ transforms a function into another domain,
which is called the frequency domain representation of the original
function
• The original function is often a function in the Time domain
• In image Processing the original function is in the Spatial Domain
• The term Fourier transform can refer to either the Frequency domain
representation of a function or to the process/formula that
"transforms" one function into the other.
Our Interest in Fourier Transform

• We will be dealing only with functions (images) of finite duration, so we
will be interested only in the discrete Fourier transform
Applications of Fourier Transforms

 1-D Fourier transforms are used in Signal Processing


 2-D Fourier transforms are used in Image Processing
 3-D Fourier transforms are used in Computer Vision
 Applications of Fourier transforms in Image processing: –
– Image enhancement,
– Image restoration,
– Image encoding / decoding,
– Image description
Discrete Fourier Transform (DFT)
• The discrete Fourier transform pair that applies to
sampled functions is given by:

F(u) = (1/M) ∑x=0..M−1 f(x) exp(−j2πux/M),   u = 0, 1, 2, …, M−1

and

f(x) = ∑u=0..M−1 F(u) exp(j2πux/M),   x = 0, 1, 2, …, M−1
2-D Discrete Fourier Transform

• In 2-D case, the DFT pair is:


F(u,v) = (1/(MN)) ∑x=0..M−1 ∑y=0..N−1 f(x,y) exp(−j2π(ux/M + vy/N))

u = 0, 1, 2, …, M−1  and  v = 0, 1, 2, …, N−1

and:

f(x,y) = ∑u=0..M−1 ∑v=0..N−1 F(u,v) exp(j2π(ux/M + vy/N))

x = 0, 1, 2, …, M−1  and  y = 0, 1, 2, …, N−1
Polar Coordinate Representation of FT

• The Fourier transform of a real function is generally


complex and we use polar coordinates:

F(u,v) = R(u,v) + j·I(u,v)

Polar coordinates:  F(u,v) = |F(u,v)| exp(jφ(u,v))

Magnitude:  |F(u,v)| = [R²(u,v) + I²(u,v)]^(1/2)

Phase:  φ(u,v) = tan⁻¹( I(u,v) / R(u,v) )
Shifting the Origin to the Center
Fourier Transform: shift
• It is common to multiply input image by (-1)x+y prior to
computing the FT. This shift the center of the FT to (M/2,N/2).

ℑ{f(x,y)} = F(u,v)

ℑ{f(x,y)·(−1)^(x+y)} = F(u − M/2, v − N/2)
Symmetry of FT

• For real image f(x,y), FT is conjugate symmetric:

F(u,v) = F*(−u,−v)

• The magnitude of FT is symmetric:

|F(u,v)| = |F(−u,−v)|
The central part of FT, i.e. the low
frequency components are
responsible for the general gray-level
appearance of an image.

The high frequency components of


FT are responsible for the detail
information of an image.

Image Frequency Domain
(log magnitude)
(Figure: log-magnitude spectrum - low frequencies near the centre give the general appearance, high frequencies the detail. Reconstructions using 5%, 10%, 20% and 50% of the coefficients.)
Frequency Domain Filtering

Frequency Domain Filtering

• Edges and sharp transitions (e.g., noise) in an


image contribute significantly to high-frequency
content of FT.
• Low frequency contents in the FT are
responsible to the general appearance of the
image over smooth areas.
• Blurring (smoothing) is achieved by attenuating
range of high frequency components of FT.

Convolution in Time Domain
g(x,y) = h(x,y) ⊗ f(x,y)

g(x,y) = ∑x'=0..M−1 ∑y'=0..M−1 h(x',y')·f(x−x', y−y')  ≡  f(x,y) * h(x,y)
– f(x,y) is the input image
– g(x,y) is the filtered image
– h(x,y): impulse response

Convolution Theorem
Multiplication in the frequency domain:   G(u,v) = F(u,v)·H(u,v)

Convolution in the time domain:   g(x,y) = h(x,y) ⊗ f(x,y)

• Filtering in the frequency domain with H(u,v) is equivalent to filtering in the spatial domain with h(x,y).
Examples of Filters

Frequency
domain

Gaussian lowpass filter Gaussian highpass filter


Spatial domain

Ideal low-pass filter (ILPF)

H(u,v) = 1   if D(u,v) ≤ D0
H(u,v) = 0   if D(u,v) > D0

D(u,v) = [(u − M/2)² + (v − N/2)²]^(1/2)

(M/2, N/2): center in the frequency domain
D0 is called the cutoff frequency.
Shape of ILPF

Frequency domain

h(x,y)

Spatial domain

FT

ringing and
blurring

Ideal in the frequency domain means
non-ideal in the spatial domain,
and vice versa.
Butterworth Lowpass Filters (BLPF)
• Smooth transfer function,
no sharp discontinuity,
no clear cutoff frequency.

H(u,v) = 1 / [1 + (D(u,v)/D0)^(2n)]
No serious
ringing
artifacts

Gaussian Lowpass Filters (GLPF)
• Smooth transfer function, smooth impulse response, no ringing.

H(u,v) = e^(−D²(u,v) / (2·D0²))
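A sketch of Gaussian lowpass filtering carried out in the frequency domain, following the convolution theorem above (the cutoff D0 is an illustrative choice):

% Gaussian lowpass filtering in the frequency domain
f      = double(im);
[M, N] = size(f);
F      = fftshift(fft2(f));               % centred Fourier transform
[u, v] = meshgrid(1:N, 1:M);
D2     = (u - N/2).^2 + (v - M/2).^2;     % squared distance from the centre
D0     = 30;                              % cutoff frequency (illustrative)
H      = exp(-D2 / (2 * D0^2));           % Gaussian lowpass transfer function
g      = real(ifft2(ifftshift(F .* H)));  % filter and transform back
imshow(g / max(g(:)));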

GLPF

Frequency
domain

Gaussian lowpass filter


Spatial domain

No ringing
artifacts

Examples of Lowpass Filtering

Examples of Lowpass Filtering

Low-pass filter H(u,v)

Original image and its FT Filtered image and its FT

High-pass Filters
• Hhp(u,v) = 1 − Hlp(u,v)

• Ideal:
  H(u,v) = 1   if D(u,v) > D0
  H(u,v) = 0   if D(u,v) ≤ D0

• Butterworth:
  H(u,v) = 1 / [1 + (D0/D(u,v))^(2n)]

• Gaussian:
  H(u,v) = 1 − e^(−D²(u,v) / (2·D0²))
Butterworth High-pass Filtering

Gaussian High-pass Filtering

Ideal High-pass Filtering
ringing artifacts

Gaussian High-pass Filtering

Original image Gaussian filter H(u,v)

Filtered image and its FT

Example of Gaussian LPF and HPF

Homomorphic filters
• A simple image model: illumination–reflection model
– f(x,y) : the intensity is called gray level for monochrome image
– f(x,y) = i(x,y)·r(x,y)
– 0 < i(x,y) < ∞, the illumination
– 0 < r(x,y) < 1, the reflectance
Homomorphic filters (cont’)

• The illumination component


– Slow spatial variations
– Low frequency
• The reflectance component
– Vary abruptly, particularly at the junctions of dissimilar objects
– High frequency
• Homomorphic filters
– Affect low and high frequencies differently
– Compress the low frequency dynamic range
– Enhance the contrast in high frequency
Homomorphic filters (cont’)
Homomorphic filters (cont’)
• f(x,y) = i(x,y)·r(x,y)
• z(x,y)=ln f(x,y) = ln i(x,y) + ln r(x,y)
• F{z(x,y)} = F{ln i(x,y)} + F{ln r(x,y)}
• S(u,v) = H(u,v) I(u,v) + H(u,v) R(u,v)
• s(x,y) = i’(x,y) + r’(x,y)
• g(x,y) = exp[s(x,y)] = exp[i’(x,y)]exp[r’(x,y)]
Homomorphic filters (cont’)
Homomorphic filters - example

H(u,v) = (γH − γL)·[1 − e^(−c·D²(u,v)/D0²)] + γL
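A sketch of the whole homomorphic pipeline with this transfer function (γL, γH, c and D0 are illustrative parameter choices):

% Homomorphic filtering: log -> FFT -> high-emphasis filter -> IFFT -> exp
f      = double(im) + 1;                        % +1 avoids log(0)
z      = log(f);                                % illumination and reflectance become additive
Z      = fftshift(fft2(z));
[M, N] = size(f);
[u, v] = meshgrid(1:N, 1:M);
D2     = (u - N/2).^2 + (v - M/2).^2;
gL = 0.5;  gH = 2.0;  c = 1;  D0 = 30;          % gamma_L, gamma_H, c, cutoff
H      = (gH - gL) * (1 - exp(-c * D2 / D0^2)) + gL;
s      = real(ifft2(ifftshift(H .* Z)));
g      = exp(s) - 1;                            % back to the intensity domain
imshow(g / max(g(:)));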
End of Lecture

