
GFM program

Exercise 5

Radiometric
corrections on
optical imagery

Tal Feingersh, April 2001


Adapted for ERDAS Imagine by
Valentyn Tolpekin, November 2004
Time
Two periods (4 hours)

Objectives
1. To get acquainted with atmospherically induced radiometric distortions.
2. To perform radiometric corrections (haze, sun-angle).
3. (Advanced) To perform sensor calibration (reflectance-to-radiance conversion).

Software
ERDAS Imagine 8.7

Data
Provided as additional material in Blackboard.
Contains:
1. Landsat-5 TM (1989), bands 1, 2, 3, 4 of the island of Ameland
2. Landsat-7 ETM+ (1999), bands 1, 2, 3, 4 of the island of Ameland
3. SPOT panchromatic of the island of Ameland*
4. A digital version of this document (rad cor.doc)**

* any set of images will do as long as sun-angle information is given for pan and they all show
the same area.

** for some operations needed on page 12.

Introduction

This exercise falls within the “rectification and correction” group of image processing
techniques and focuses on radiometric corrections and noise removal. All data intended for
processing needs to be corrected for radiometric and geometric distortions. In the next
sections we describe the datasets available for this work, explain the relevance of certain
pre-processing steps and follow these for each input. The general workflow of the
pre-processing stage of image data is outlined in figure 1; the dashed circle indicates
the focus of this exercise, and within it the bold text indicates what we will practice
today. Part 1 of the exercise deals with optical data and part 2 deals with radar data.
For simplification we will call optical data VNIR (Visible and Near Infra-Red) and radar
data SAR (Synthetic Aperture Radar). The two parts can be done separately as two
exercises. Other types of image preprocessing operations are not dealt with here.
System corrections are assumed to be done.

[Figure 1 shows the pre-processing workflow for optical (VNIR) data: System correction,
followed by Radiometric corrections (Haze, Sun angle, Skylight, Other), then Geometric
corrections (Relief distortion, Panoramic view, Earth curvature), and finally Resampling.]

Figure 1 A sequence of some possible corrections for input data

Before you start, it is a good idea to copy the data to your local hard disk, e.g. to a folder
like D:\radcor

Visible and Near InfraRed (VNIR) data

Three images are available: SPOT panchromatic, Landsat-5 TM and Landsat-7 ETM+. All
provide almost cloud-free skies. TM’s repeat cycle is 16 days (at 705 km altitude);
SPOT’s is 1 to 4 days, depending on the look angle (remember, it is a pointable sensor).
The TM system contains 7 sensors, sensitive to 7 radiometric channels respectively. Some
properties of the original data are listed in table 1 below.

Satellite / sensor   File name(s)   Date          Resolution / Pixel size   Number of bands
SPOT panchromatic    PAN            3 July 1989   10 m                      1
Landsat-5 TM         TM89           21 Dec 1989   30 m                      4 (in this exercise)
Landsat-7 ETM+       ETM99          3 Nov 1999    30 m                      4 (in this exercise)

Table 1. VNIR data

The first four Landsat channels which we use in this exercise are Band 1 covering 0.45 to
0.52 µm (blue), Band 2 covering 0.52 to 0.60 µm (green), Band 3 covering 0.63 to 0.69 µm
(red), and Band 4 covering 0.76 to 0.90 µm (near infrared). The TM and ETM+ channels
(blue, green, red and NIR) are identical.

Take a look at the images. Display the maps PAN and ETM99 using the Fit to Frame
display option. Position the map windows next to each other (use the Tile Viewers
command under the View menu). Link the viewers by choosing View \ Link/Unlink
Viewers \ Geographical and clicking on the second viewer to select it.

Display pixel information by choosing Utility \ Inquire Cursor. Two white
lines appear; their intersection indicates the position of the selected pixel.
Check the pixel values of these images by moving the position of the cursor
with the mouse. You will see pixel values for 4 bands in the case of the
multispectral image and one pixel value for the panchromatic image. Real
intensities are indicated in the FILE PIXEL column; the LUT VALUE column
shows the values after the contrast adjustment used for visualization purposes.
Pressing the button will switch the viewer. Compare the pixel values in
different images.
Note: although it is usually better to correct geometric distortions after
radiometric corrections, here the images are corrected geometrically for a
simpler comparison of exact pixels by their coordinates.

Close the Inquire Cursor Window and all Viewers.

Distortions and their corrections in VNIR data

The following corrections apply to distortions introduced by the atmosphere. Distortions
originating from the sensor system itself (dropped-line replacement and de-banding) and
other not-absolutely-necessary radiometric corrections are out of the scope of this exercise.
You can find more reading material on these in the references at the end of the exercise. It is
important to understand that most other corrections will be done by data providers. The
more your data is readily corrected and enhanced by the provider, the quicker you can get to
the analysis part of your work, and the more you pay.

Atmospherically induced distortions:


Corrections for haze, sun-angle and skylight

In all, radiometric corrections here include haze correction, sun-angle correction and
skylight correction. An overall simplified illumination model is illustrated in figure 2. It has to
take the haze, the skylight and the sun angle into account, and their interaction with (1) the
reflectance of the surface material, (2) the irradiance from the sun, and (3) the sensor
indicator, as a function of the radiance detected by the sensor. This can be expressed as

Di = (T sin θ + Si) Ri + Hi [1]

where D is the radiance detected by the sensor, T is the irradiation from the sun, θ is the sun
elevation angle, S is the skylight, H is the haze, R is the reflectance of the surface and i is the
sensor (band) indicator. Note that whereas skylight and sun-angle have a multiplicative effect,
haze is an additive effect, acting both ways from sun to sensor, through the atmosphere.

Figure 2 The effect of the atmosphere in determining various paths of energy that
illuminate a pixel and reach back to the sensor.
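A small numeric check makes the additive/multiplicative distinction concrete. The sketch
below assumes equation [1] has the form Di = (T sin θ + Si) Ri + Hi; all input values are
invented purely for illustration.

```python
import math

def detected_radiance(T, theta_deg, S_i, R_i, H_i):
    """Simplified illumination model: D_i = (T*sin(theta) + S_i)*R_i + H_i."""
    return (T * math.sin(math.radians(theta_deg)) + S_i) * R_i + H_i

# Hypothetical values for one band: irradiation T, sun elevation theta,
# skylight S_i, surface reflectance R_i and haze H_i.
d1 = detected_radiance(T=100, theta_deg=58.9, S_i=5, R_i=0.3, H_i=10)
d2 = detected_radiance(T=100, theta_deg=58.9, S_i=5, R_i=0.3, H_i=20)
# Haze is additive: extra haze shifts D by a constant, regardless of R_i.
print(round(d2 - d1, 6))  # 10.0

# Sun angle is multiplicative: a lower sun scales the reflected term down.
d3 = detected_radiance(T=100, theta_deg=30.0, S_i=5, R_i=0.3, H_i=10)
print(d3 < d1)  # True
```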

Haze has an additive effect on the overall image, resulting in higher DN values, and as
such it reduces the contrast. Since it is λ-dependent, its impact differs per band: highest in
the blue range, lowest in the IR range. To correct it, we estimate for each band the haze
contribution and subtract this value from all measurements (pixels) in the relevant band. This
correction is always needed and is done either per band (by computing the histogram and
subtracting the minimum value from all the values, as shown in figure 3) or for the whole
image at once (by looking for a blackbody-like target such as deep, clear, unpolluted
water, or shadow, and subtracting the expected value from all the pixels).

[Figure 3 sketches two histograms of one band: (A) the original histogram, ranging from
DN 50 to 212; (B) the histogram after haze correction, shifted down by 48 so that it
ranges from DN 2 to 164 (212 - 48 = 164).]

Figure 3 Haze correction per band: (A) original histogram, (B) resulting change

Just as a minimum can be estimated (representing a black body), so can a maximum be
estimated, thus mapping the highest values of the histogram to a certain estimate. This
‘upper bound’ can be a thick white cloud, for example. We limit our haze correction to the
‘lower bound’, as shown in table 3. Deep water was selected and estimated by a scalar of
value 2, which was kept equal for all bands for simplification, to give

DN = Vw + Vh [2]

where DN is the input value, Vw is the estimated lower bound (2 in our case, for deep water)
and Vh is the haze contribution.
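As a sketch outside ERDAS, the haze subtraction of equation [2] can be written in a few
lines of NumPy. This is an illustration only: the tiny band array is made up, and using the
band minimum as the deep-water minimum is a simplifying assumption (in the exercise you
sample deep water by hand).

```python
import numpy as np

def correct_haze(band, v_water=2):
    """Subtract the haze contribution Vh from one band (equation [2])."""
    # Assumption: the darkest pixel is deep water, which without an
    # atmosphere would read DN = v_water, so Vh = min(DN) - v_water.
    vh = int(band.min()) - v_water
    # Subtract Vh from every pixel; clip so values stay valid 8-bit DNs.
    return np.clip(band.astype(np.int16) - vh, 0, 255).astype(np.uint8)

# Toy band whose darkest pixel is 50 and brightest 212, as in figure 3.
band = np.array([[50, 120], [212, 80]], dtype=np.uint8)
print(correct_haze(band))  # Vh = 48, so the result runs from 2 to 164
```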

Do this for three scenes: ETM99 (bands 1, 2, 3, 4), TM89 (band 4 only) and SPOT
panchromatic (“PAN”), with the help of the following instructions.
• Open an image in a Viewer. Choose Display As Grayscale in the Raster Options and
choose the band of interest for the multispectral image. Further, check the Clear Display,
Fit to Frame and No Stretch options.
• Open the Inquire Cursor window. Note that it shows pixel values for all bands of the
multispectral image. Read the value for the proper band!
• Locate deep water in the sea (try to find the darkest part of the water).
• Place the cursor in this deep area; the pixel values are shown. You may move the cursor
around a little to find the lowest value. We can assume that without an atmosphere it
would have been about DN = 2. Therefore you can retrieve Vh, per band, for all images.
Note down the lowest value that you can find in your chosen area. Zoom in if necessary.
Fill your observations in table 3 below (in the gray cells only).

Image   Band 1      Band 2      Band 3      Band 4      Panchromatic
        DN    Vh    DN    Vh    DN    Vh    DN    Vh    DN    Vh
TM89    xx    xx    xx    xx    xx    xx    __    __    xx    xx
ETM99   __    __    __    __    __    __    __    __    xx    xx
PAN     xx    xx    xx    xx    xx    xx    xx    xx    __    __

Table 3. Haze contribution estimates over deep water, per spectral band
(__ marks the gray cells to fill in; xx marks bands not used for that image).

Close the Inquire Cursor Window and all Viewers.

As you can see, there is a different atmospheric impact at different wavelengths. The longer
the wavelength (from band 1 to band 4), the smaller the impact of the atmosphere. What
about PAN? What range of the spectrum does it cover? How does this relate to the observed
difference between the ETM99 channels and the panchromatic channel? Think about it.

Correct the images for haze, by applying equation [2] above.


• Open a Viewer. Display an image you want to correct for haze with the No Stretch option.
Choose File \ Save \ Save Top Layer As… and save the image under a different name (for
example, etm99h.img, tm89h.img and panh.img). You will apply the changes to this
image, leaving the original image unmodified.
• Display the histogram of the image. First, open the Image Information window by
pressing the button in the Viewer. In this window choose the Histogram tab. You can
display histograms of different bands by selecting another Layer value. Examine the
histogram. Bring the cursor to the left part of the histogram and read the value
indicated. Is that value close to what you found for this band before (table 3)?
Why?
• In the case of a multispectral image, open Raster \ Band Combinations and note which
bands are displayed as the Red, Green and Blue channels. You may want to change the
arrangement of the bands.
• Use the Raster \ Offset Values command to subtract the haze contribution from the
displayed bands. Note that positive offset values will increase the resulting pixel
values whereas negative offset values will decrease them. Think about what you should
enter as offset values.
• You cannot offset more than 3 bands at a time. When you are ready, apply the
changes and accept the warning. Close the viewer. Accept the suggestion to save
changes.
• You may want to repeat the procedure if you have not yet corrected all bands of the
image. Apply the correction only to the bands not corrected before!
• Open a new Viewer and display the same image again. Does it look different after the
correction?
• Display the histogram of the corrected image (all bands). Did it change after applying
the offsets?

• How can you explain the difference between the values of different channels?

• How can you explain the difference between the values of different scenes?

• Suppose we had a SPOT image with channels that correspond to R, G, B as well.
Would we then have different values than those of the TM scene? Why (or why not)?

Sun angle correction

If we do not have images from different dates for the same sensor and area, one way
is to divide the (single) image values by the sine of its sun elevation angle,
resulting in slightly higher values,

DN' = DN / sin(θ) [3]

where DN is the input pixel, DN' is the output pixel, and θ is the sun elevation angle (in
degrees). Note that since the angle is smaller than 90° the sine is smaller than 1 and as a
result DN' > DN.
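A quick sketch of equation [3] (again NumPy, for illustration only; note that math.sin
expects radians, so the elevation angle given in degrees must be converted first):

```python
import math
import numpy as np

def correct_sun_angle(band, elevation_deg):
    """Divide DNs by the sine of the sun elevation angle (equation [3])."""
    # Convert degrees to radians before taking the sine.
    factor = 1.0 / math.sin(math.radians(elevation_deg))
    # Since sin(theta) < 1 for theta < 90 degrees, all values increase.
    return np.clip(np.round(band * factor), 0, 255).astype(np.uint8)

band = np.array([[100, 200]], dtype=np.uint8)
print(correct_sun_angle(band, 58.9))  # 1/sin(58.9°) ≈ 1.168: [[117 234]]
```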

Now, to work. First you will apply this correction the absolute way, then the relative
way.

(The absolute way). Usually, you will find the sun elevation angle (θ) in the header file of
the images. Here it is provided for you.
• Calculate the sine of θ for the same PAN image you used earlier and fill your result in
table 4.

Scene   θ in degrees   sin(θ)
PAN     58.9

Table 4. Sun-angle correction value for the image PAN.

• Open the panh.img file in the Viewer. Save it with the name pans.img.
• Apply the correction for sun angle following equation [3] above. In the Raster menu choose
Contrast \ General Contrast. In the Contrast Adjust window set the Linear Method and input
the correction value (1/sin θ, since this value will be used to multiply all pixel values) as the
Slope. Set the Histogram Source as Whole Image and Apply to: Image File. Apply the
correction and close the Contrast Adjust window. Close the Viewer. Accept the changes.

Now open the input and output images in two Viewers side by side and compare the results.
Locate specific pixels (with the pixel information window) and examine the difference in
values. It might be, however, that some values do not change, if the absolute difference
is less than 1 (since pixel values are integers).

• Why do we apply haze correction before sun-angle correction? Does it make a difference?

For multi-temporal analysis it is useful to have all images corrected for the same
illumination angle. In this method a stable ground cover is selected and its reflectance
values are plotted for the same band, from different dates (see figure 4). The regression of
the plotted set of values is then compared to the reference 1:1, 45° line. The assumptions
are that haze has already been corrected (so the regression starts from the origin) and
that the measured ratio is equal for all the bands.

(The relative way).

This approach makes sense only if you have more than one image of the same area, and
the images are from different dates. You have to realise that this correction will equalize the
sun elevation angles of the 2 images but will not make these angles equal to the zenith (90°)!

[Figure 4 sketches, for one band, the sample values of date i (horizontal axis) against
date j (vertical axis), with the 1:1 reference line and the points a, b and c marking the
offsets between the samples and the reference.]

Figure 4 Sun angle correction. The ratio ac/ab is then a measure of error from a correct
situation. The star-like shape is the distribution of samples and its mean.

• Display both NIR images (from the different dates) in two Viewers side by side: TM894
and ETM994.
• Link the Viewers.
• From one of the Viewers start the Inquire Cursor.
• Retrieve the DN values of objects that are unlikely to change their reflectance between
dates (e.g. houses, clear road crossings etc.). It is very important that your sampling is
accurate; therefore zoom in to the pixel level if necessary. Take 5 samples (pixel values)
from each of the 2 dates you have.
• Careful now! (In the digital version of this document, “rad cor.doc”.) Double-click figure 5
below. Once activated, fill in only the grey cells with the corresponding observations you
made for 5 pixels. The mean and the correction factor are then calculated. Click outside the
figure area to terminate this operation. The diagonal arrow indicates the reference line (if
you can’t double-click, fill it in manually).

[Figure 5 is an embedded spreadsheet: a table with columns Sample # (1 to 5), TM894 and
ETM994, in which the grey cells are to be filled in; it computes the mean of each column,
the reference image and the correction factor r (shown as #DIV/0! until values are entered),
together with a scatter plot of ETM994 against TM894 (axes 0 to 250) showing the samples,
their mean and the reference line.]

Figure 5 Sun angle correction from one image to another

• The ratio ac/bc is calculated as indicated in figure 4 earlier. In case the mean falls above
the reference line, the ratio is calculated with reference to the vertical axis (i.e. ETM99 will
be the reference image). Otherwise, TM89 will be the reference image.
• The calculated ratio value (r) should be used as a multiplier for all pixel values in the
image that is to be corrected. This will result in an increase of all values.
• In an ERDAS Viewer, save the image that is to be corrected under a new name (for
example, add the letter s at the end of the image name).
• Apply the correction factor you found by using Linear Contrast (as before). Note that it will
multiply the pixel values in all bands of the image simultaneously.

• Why do we apply haze correction before sun-angle correction? Does it make a difference?

• Note that in the case of multiband images, we do not need to calculate the sun-angle
correction separately for each of the bands. (Why not?)

Discussion

Usually, images are not taken when the sun is at the zenith (90° above the horizon, exactly
above the area of the image). This means we should choose one of the images to be the
reference image, to which the radiometric values of the other image are adjusted. It makes
more sense to use the one with the higher sun elevation angle (why?) and, in fact, you can
tell which one has the higher sun elevation angle by simply looking at the histogram (how
come?). Looking at the scattergram above you can see that the date that has the higher
value for the mean is the date that has the higher sun elevation angle, and as such forms
the reference image. In fact, the ratio ac/bc (check figure 4 again) is also equal to the ratio
of the higher value of the mean over the lower value of the mean.

Example: if the mean is (190, 125) the ratio will be 190/125 = 1.52 and that is the correction
factor. This is true simply because 125 × 1.52 = 190. If you apply this factor not only to pixels
with value 125, but to all pixels, you calibrate one image to the values of the other (reference)
image. It does not matter if the mean falls above the reference line, since the ratio will be the
same (ac/bc), but this time it will be projected on the y-axis instead of the x-axis; only this
time it will be smaller than 1. Take for example the mean (230, 190). You can find the factor
and apply it to 190 to see that it goes back to 230. It is, however, important to note that in
some cases the factor will cause some compression of the radiometric content and so some
loss of information. Take the first example (190, 125). If the image that has the mean value
125 also has pixels of value 170 and higher, then once corrected for sun angle these values
will exceed 255 and will be clipped to 255. There are ways of rescaling the image for that
reason, or of applying the factor in different ways, but they are out of the scope of this
exercise.
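The relative correction can be sketched in the same spirit; the five sample pairs below
are invented for illustration, and the clipping at 255 reproduces the loss of information
just discussed.

```python
import numpy as np

# Five hypothetical sample pairs from stable targets on the two dates.
tm89  = np.array([120,  95, 150, 110, 135], dtype=float)
etm99 = np.array([150, 119, 187, 138, 169], dtype=float)

mean_tm, mean_etm = tm89.mean(), etm99.mean()  # 122.0 and 152.6
# The date with the higher mean has the higher sun elevation and is the
# reference; the other image is multiplied by the ratio of the means.
r = max(mean_tm, mean_etm) / min(mean_tm, mean_etm)
print(round(r, 2))  # 1.25

def apply_factor(band, r):
    # Multiply all DNs by r; values above 255 are clipped, which
    # compresses the radiometric content (loss of information).
    return np.clip(np.round(band * r), 0, 255).astype(np.uint8)

# With the (190, 125) example factor of 1.52: 125 -> 190, 220 -> clipped.
print(apply_factor(np.array([125, 220], dtype=float), 1.52))  # [190 255]
```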

Skylight correction

Skylight (or sky irradiance) reduces contrast. Figure 2 shows that sky irradiance has 2
components, coming from the path of the electromagnetic energy (component 1) and from
neighbouring pixels (component 2), respectively. In order to correct component-2 distortion
we need to know the influence of neighbouring pixels on a given pixel. The influence will be
a function of distance. In fact, in very high resolution imagery it is assumed that there is no
influence beyond that of 4-connected neighbourhood structures, i.e. beyond those pixels
that share a side with the pixel of interest. As for component-1 distortion, we need a model
that describes the composition of the atmospheric column (gases and aerosols) that was
above the area of the image at the time of acquisition. If at all, you can find such
information in national meteorological centres, but usually you won’t be so lucky. Therefore
this correction is mostly neglected. In general, you can reduce such effects by avoiding
clouded images in your analysis.

Reflectance and Radiance - (advanced section!)

Although DN values record the intensity of each spot viewed on the Earth’s surface, the
values depend on the radiometric resolution of the sensor. For example, Landsat MSS
sensors measure radiation at 6 bits (64 grey levels) whereas the TM and ETM+ sensors use
an 8-bit range (256 grey levels). These values are proportional to the upwelling
electromagnetic radiation (radiance) but are not exactly the same. The true units of (spectral)
radiance are W m-2 sr-1 µm-1: watts per square metre, per steradian (a three-dimensional
angle from a point on the surface of the Earth to the sensor), per unit wavelength.
When image processing is done (as is usually the case) on a single image, the true spectral
radiance of materials is not important, so DN values will do. However, when spectral
signatures are considered, it is worth obtaining radiance values of materials for a more
robust analysis of the image. This is therefore useful in two types of situations: (a) when
comparing 2 images of the same area from different sensors (e.g. in a mosaic), and (b) when
using the same sensor to monitor a given phenomenon over a long period of time (months
to years).

So why don’t we always use radiance instead of reflectance? Radiance values are converted
by providers to reflectance because radiance depends on the degree of illumination of the
object, so factors like time of day, season, latitude etc. all make a difference. Reflectance
represents the ratio of radiance to irradiance and so provides a standardised measure
which is comparable between images.

To convert data from reflectance (DN) to radiance, the sensor response (of a specific band)
should be calibrated. Sensors change over time, and detector performance deteriorates in a
linear fashion. Similarly to the haze correction, the bias is the shift from the 0 radiance value
and the gain is the slope of the conversion function. After such a correction (also called
sensor calibration), the lowest radiance will always translate to DN = 0 and the highest
radiance to DN = 255. Bear in mind again that gain and bias differ per band, since we are
dealing with separate detectors.

Radiance (L) = (Gain × DN) + Bias

You could try this and see if the values end up the same for the exact same pixels.
Calibration factors can be retrieved from the header file of an image. In this case they were
collected for you and are given below. Apply the sensor calibration for ETM+ and TM band 4
(NIR) as follows:

ETM994rad = (1.496 × ETM994s) - 5.1

TM894rad = (0.965 × TM894h) + 2.56

Since all bands are corrected with the same parameters, you can apply this correction using
the General Contrast option. Apply a linear Contrast Adjustment, where this time the bias is
the offset value and the gain is the slope.
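The same calibration can be sketched outside ERDAS. The gain/bias pairs below are the
band-4 values given above (for other bands and sensors you would read them from the
header file):

```python
import numpy as np

# Gain and bias for band 4 (NIR), as given in the exercise.
CAL = {
    "ETM994": (1.496, -5.1),
    "TM894":  (0.965,  2.56),
}

def dn_to_radiance(dn, sensor):
    """Radiance L = (gain * DN) + bias for one band of one sensor."""
    gain, bias = CAL[sensor]
    return gain * dn.astype(float) + bias

dn = np.array([100], dtype=np.uint8)
print(dn_to_radiance(dn, "ETM994"))  # 1.496*100 - 5.1, i.e. about [144.5]
```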

Open both outputs and compare specific pixels with the pixel information window. Hopefully,
stable ground cover will give you very similar values. However, you can still find differences!
How come?

Summary

o Radiometric distortions originate, among other things, from the atmosphere, and
generally reduce the contrast of the image.

o Radiometric corrections are only one type of digital image processing.

o Atmospherically induced radiometric corrections are called haze, sun-angle and
skylight corrections. Haze is an additive distortion while skylight and sun-angle
distortions are multiplicative. These corrections can be applied to a single image or
used to equalize images (relative correction).

o The one correction which is somewhat more complex to solve is skylight, since we
need a model of the atmospheric column at the location and the time of acquisition of
the specific image. This information is hard to find and, as a result, this correction is
mostly neglected.

o When dealing with the analysis of spectral signatures, it is worth considering the
conversion from reflectance to radiance values.

References

- Feingersh, T. (2000), Synergy of multi-temporal SAR and optical imagery for crop
mapping. International Institute for Aerospace Survey and Earth Sciences (ITC),
Enschede, The Netherlands.

- Lillesand, T. M. and Kiefer, R. W. (1994), Remote sensing and image interpretation.
3rd ed., Wiley & Sons, New York, 750 pages (pp. 531-536).

- Pohl, C. (1996), Geometric aspects of multisensor image fusion for topographic
updating in the humid tropics. (Chapter 5)

- UNESCO (1999), Applications of satellite and airborne image data to coastal
management. Coastal region and small island papers 4, UNESCO, Paris, vi + 185 pp.

- http://ltpwww.gsfc.nasa.gov/IAS/handbook/handbook_htmls/chapter6/chapter6.html#section6.4

- http://landsat.usgs.gov/technical.html

