Digital Image Processing: Lecture notes
Camera obscura
Since ancient times it has been known that a brightly illuminated scene can be projected to an image
in a dark room (Latin: camera obscura)
through a small hole (aperture)
The image is rotated 180°
Jorma Kekalainen
Plenoptic function
At a point x = (x1, x2, x3) in space we can measure how much light energy travels in the direction n = (n1, n2, n3), ||n|| = 1
The plenoptic function is the corresponding radiance intensity function p(x, n)
A camera is a device that samples the
plenoptic function
Different types of cameras sample it in
different ways
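As an illustrative sketch (all function names here are assumptions, not from the notes), an ideal pinhole camera can be modelled as sampling p(x, n) at a single fixed point x over many directions n:

```python
import numpy as np

def unit(v):
    """Normalize a direction vector so that ||n|| = 1."""
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

def pinhole_samples(p, x, directions):
    """Sample the plenoptic function p(x, n) at one fixed point x
    (the pinhole) for a list of directions: one way a camera
    'samples the plenoptic function'."""
    return [p(x, unit(n)) for n in directions]

# Toy plenoptic function: radiance depends only on the direction here.
p = lambda x, n: max(0.0, n[2])      # brightest straight ahead
x = np.zeros(3)                      # pinhole position
values = pinhole_samples(p, x, [(0, 0, 1), (1, 0, 1)])
```

A different camera type (e.g. a light-field camera) would sample p over many positions x as well.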
Thin lenses
The simplest model of a lens
Focuses all points in an object plane onto the
image plane
Object plane
The object plane consists of all points that appear sharp when projected through the lens onto the image plane.
The object plane is an ideal model of where the sharp points are located
In practice the object plane may be non-planar: e.g.
described by the surface of a sphere
The shape of the object plane depends on the quality
of the lens (system)
Focal length
The thin lens is characterized by a single parameter:
the focal length fL
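The slide only names the parameter, but the standard thin-lens relation 1/fL = 1/a + 1/b (a: object distance, b: image distance) shows how fL links the object plane and the image plane. A small sketch assuming that standard formula (names are mine):

```python
def image_distance(f_L, a):
    """Thin-lens equation 1/f_L = 1/a + 1/b, solved for the image
    distance b given focal length f_L and object distance a."""
    if a == f_L:
        raise ValueError("object in the focal plane: image at infinity")
    return 1.0 / (1.0 / f_L - 1.0 / a)

# An object 1 m in front of a 50 mm lens focuses ~52.6 mm behind it.
b = image_distance(0.050, 1.0)
```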
Diffraction pattern
Due to the wave nature of light, even when
various lens effects are eliminated, light from
a single 3D point cannot be focused to an
arbitrarily small point if it has passed an
aperture.
For coherent light:
Huygens's principle: treat the incoming light as a
set of point light sources
Gives diffraction pattern at the image plane.
Superposition
The principle of superposition means that the resulting wave-function at the image plane is a sum/integral of the contributions from the different light sources.
Airy disk
The smallest resolvable distance in the image plane, Δx, is given by Δx ≈ 1.22 λf/D, where f is the focal length and D the aperture diameter.
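A small numerical check of this diffraction limit (the 1.22 factor is the first zero of the Airy pattern; the function name and example lens are mine):

```python
def airy_resolution(wavelength, f, D):
    """Smallest resolvable distance in the image plane for a lens of
    focal length f and aperture diameter D (Rayleigh criterion):
    dx ~= 1.22 * wavelength * f / D."""
    return 1.22 * wavelength * f / D

# Green light (550 nm) through a 50 mm f/2 lens (D = 25 mm):
dx = airy_resolution(550e-9, 0.050, 0.025)   # about 1.3 micrometres
```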
Lens distortion
A lens or a lens system can never map straight
lines in the 3D scene exactly to straight lines in
the image plane
Depending on the lens type, a square pattern
will typically appear like a barrel or a
pincushion
Cos⁴ law
In general, there is an attenuation of the image towards its edges, approximately according to cos⁴θ, where θ is the angle between the ray and the optical axis
This effect can be compensated in a digital camera
Note: The flux density decreases with the square of the distance to the light source: cos²θ. The effective area of the detector relative to the aperture varies as cosθ. The effective area of the aperture relative to the detector varies as cosθ.
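The digital compensation can be sketched as a per-pixel gain map (a simplified model; the focal length in pixel units, f_pix, and the names are assumptions):

```python
import numpy as np

def cos4_gain(h, w, f_pix):
    """Gain map that undoes the cos^4(theta) edge attenuation, where
    theta is the angle between a pixel's ray and the optical axis."""
    y, x = np.mgrid[0:h, 0:w]
    r2 = (x - (w - 1) / 2.0) ** 2 + (y - (h - 1) / 2.0) ** 2
    cos_t = f_pix / np.sqrt(f_pix ** 2 + r2)
    return cos_t ** -4

gain = cos4_gain(480, 640, 800.0)
# multiply the raw image by `gain`: ~1 at the centre, largest in corners
```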
Chromatic aberration
The refractive index of matter (lenses) is wavelength dependent
E.g., a prism can decompose the light into its
spectrum
A ray of white light is decomposed into rays of different
colors that intersect the image plane at different points
Image formation
Projection through the lens gives the image of the object.
Capturing image
For natural images we need a light source (λ: wavelength of the source).
E(x, y, z, λ): incident light on a point (x, y, z: world coordinates of the point)
Image formation
Camera(c(x, y, z, λ)) = the captured image
c = reflected light
Projection
There are two types of projections (P) of interest to us:
1. Perspective projection
Objects closer to the capture device appear bigger.
Most image formation situations can be considered to be under
this category, including images taken by camera and the human
eye.
2. Orthographic projection
This is unnatural.
Objects appear the same size regardless of their distance to the
capture device.
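The two projection types can be sketched as follows (the focal length f and the function names are illustrative):

```python
def perspective(point, f=1.0):
    """Perspective projection: image coordinates scale with 1/Z,
    so objects closer to the capture device appear bigger."""
    X, Y, Z = point
    return (f * X / Z, f * Y / Z)

def orthographic(point):
    """Orthographic projection: Z is simply dropped, so apparent
    size is independent of the distance to the capture device."""
    X, Y, Z = point
    return (X, Y)

near = perspective((1.0, 0.0, 2.0))   # (0.5, 0.0)
far = perspective((1.0, 0.0, 4.0))    # (0.25, 0.0): farther -> smaller
flat = orthographic((1.0, 0.0, 4.0))  # (1.0, 0.0) at any distance
```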
Perspective projection
With equal viewing angles (α1 = α2), the nearer object (l1 < l2) produces the larger image.
Image capturing
Projection onto a discrete sensor array (digital camera).
Image capturing
The sensors register the average color (sampled image).
Image capturing
Continuous colors at discrete locations.
Sampled image vs. sampled and quantized image.
Quantization
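A minimal uniform-quantization sketch (illustrative code, not from the notes):

```python
import numpy as np

def quantize(img, bits):
    """Uniformly quantize intensities in [0, 1] to 2**bits grey
    levels and map them back to [0, 1]."""
    levels = 2 ** bits
    q = np.clip(np.floor(img * levels), 0, levels - 1)
    return q / (levels - 1)

x = np.linspace(0.0, 1.0, 5)   # a sampled 1-D intensity ramp
q1 = quantize(x, 1)            # 1 bit: only black and white survive
q8 = quantize(x, 8)            # 8 bits: visually indistinguishable
```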
History of photography
1969: Boyle and Smith invent the first CCD chip for image
capture (based on the bubble memory)
1973: Fairchild Semiconductor markets the first CCD chip (100 × 100 pixels)
1975: Bayer at Kodak: first single chip color CCD camera
1981: Sony markets the Mavica, the 1st consumer digital
camera. Stores images on a floppy disc
1986: Kodak presents the first megapixel CCD camera
2006: Dalsa Corporation presents a 111 Mpixel CCD
camera
2009: Kodak announces that it will discontinue production
of Kodachrome film
Main effects
Main effects in how the image is measured to produce a digital image:
The image is spatially sampled and truncated
Photons are converted to electric charge
The charges are converted to voltage
The voltage is quantized
Photo-sensing chain
Losses
When a photon enters the semiconductor material, it may not interact as intended
1+2: reflection before entering the active material
3+4: absorption before entering the active material
5: absorption too deep in the material
6: the photon doesn't interact with the material and exits at the back
Quantum efficiency
All these effects are wavelength dependent
The mean number of electron/hole pairs
created per photon is the quantum efficiency
Light → electricity
Pure semiconductors can produce an electric current Iphoto through the material if
It is embedded in an electric field
The material absorbs a photon with an energy E = hν which is larger than Qg, the gap between the material's valence and conduction bands
Required: hν ≥ Qg, i.e. λ ≤ hc/Qg
Intrinsic absorption
This is called intrinsic absorption
No doping of the semiconductor is needed
Near IR
Visible light
UV
X-ray
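The condition λ ≤ hc/Qg gives each band gap a cutoff wavelength; a small check (the physical constants and the silicon band gap of about 1.12 eV are standard values, not from the notes):

```python
def cutoff_wavelength_nm(Q_g_eV):
    """Longest wavelength absorbed intrinsically: the photon energy
    E = h*nu must exceed the band gap Q_g, i.e. lambda <= h*c/Q_g."""
    h = 6.62607015e-34      # Planck constant, J*s
    c = 2.99792458e8        # speed of light, m/s
    eV = 1.602176634e-19    # joules per electron volt
    return h * c / (Q_g_eV * eV) * 1e9

# Silicon (Q_g ~ 1.12 eV) cuts off around 1100 nm: it detects
# visible light and near IR, matching the list above.
lam = cutoff_wavelength_nm(1.12)
```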
Photovoltaic detectors
Absorption of light can also be based on the
photovoltaic effect
When photons of sufficiently high energy are
absorbed by a material, electrons are released and
produce a voltage
Thermal excitation
Because of thermal energy in the material, electrons are always excited (moved from the valence band to the conduction band).
This induces an electric current Ithermo
Thermal noise
Ithermo is not a constant current; rather, it is a noise current with a certain mean value.
Photodiode
Electrons move from the
n+ region to fill holes in
the p-region
Photodiode
Apply a bias voltage of the same polarity as the internal field.
The space-charge region increases.
Since the space-charge region is an insulator, no current runs through the junction.
Photodiode
Remove the voltage: the charge remains.
Photodiode
The light creates a free
electron/hole pair.
Photodiode
The voltage difference generated by the photons occurs even if the diode had not been pre-charged.
Caused by the photovoltaic
effect.
The diode can in principle
be used as a solar cell.
The pre-charging makes
the photovoltaic effect
stronger since it increases
the space-charge region.
MOS capacitor
The oxide layer is a
perfect insulator,
no current passes
through this layer.
Metal
Oxide
Semiconductor
MOS capacitor
Apply a voltage across
the capacitor.
Holes in the region under
the oxide will move into
the substrate and create
a space-charge region.
An electric field is
created across the oxide
barrier and through the
space-charge region.
MOS capacitor
An absorbed photon
creates an electron/hole
pair
The hole is swept into the
substrate.
The electron is drawn by
the electric field toward the
oxide barrier.
Due to the oxide
insulation, the electrons
accumulate below the
oxide barrier, no current
flows through the capacitor.
The number of accumulated electrons is proportional to the number of absorbed photons.
Blooming
Both the photodiode and the MOS capacitor
collect electric charge in a small region
corresponding to the conductor region
When this region becomes saturated, the
charge spills over to neighboring elements
This is called blooming
Barriers between the detectors can reduce
this effect, but not eliminate it entirely
Fill factor
In practice, the light sensitive area of an image
sensor cannot fill the entire detector area.
Electronic components and wiring reduce the
light sensitive area
The fill factor is the percentage of the total area which is light sensitive:
Fill factor = (light sensitive area) / (total pixel area) × 100 %
Micro-lenses
To overcome low fill factors, an array of microlenses in front of the sensor array can be used
Transport problem
Light has caused a change in electric voltage or charge in a light
detector element (photodiode or MOS capacitor), and this change
needs to be measured to produce an image
Traditionally this is not measured per detector element
It would require many components per detector
It would give too small a fill factor for 2D arrays
Note: The fill factor is the percentage of the total area which is light sensitive.
Limiting factors:
there is a maximal readout frequency from the entire array
this limits the readout speed from the individual pixel
Covered with an
opaque metal shield
Disadvantages
It takes some time to shift the entire image from A to B; during this time area A is still sensitive to light → after-exposure
Mechanical shutters can be used to remove after-exposure
4T APS
Add a fourth transistor to each pixel (4T)
This transistor acts as a memory during readout
All photo-charge is moved globally to the memory
transistor after exposure
The other three transistors operate as a standard 3T
APS
One for recharging the diode
One for transforming charge to voltage
One for connecting the voltage to the readout row
Noise sources
Reset noise
The measured voltage depends on the fixed bias voltage over the photodiode or MOS capacitor
This voltage always has some amount of variation = noise
Noise sources
The space-charge region is not a perfect insulator, so there is a small leakage current
It is called dark current since it discharges the capacitor even when no photons are absorbed
Thermal noise
Can be reduced by cooling
Shot noise
Even if a constant number of photons hit the photo detector, the absorption process is probabilistic:
Each time we observe/measure the voltage/charge difference at the detector, there will be a small variation in the result
The shorter the exposure time, the larger this variation (and vice versa)
This noise has approximately a Poisson distribution
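The Poisson character can be illustrated numerically: the relative variation of the measured count falls as 1/sqrt(mean count), so longer exposures (more photons) are relatively less noisy (a simulation sketch; names are mine):

```python
import numpy as np

rng = np.random.default_rng(0)

def relative_noise(mean_photons, n_trials=200_000):
    """Repeat an exposure many times: absorbed-photon counts are
    Poisson distributed, and std/mean ~ 1/sqrt(mean_photons)."""
    counts = rng.poisson(mean_photons, n_trials)
    return counts.std() / counts.mean()

short = relative_noise(10)       # short exposure: ~32 % variation
long_ = relative_noise(10_000)   # long exposure: ~1 % variation
```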
SNR
SNR = 20 log10(V / V_noise) [dB]
It means that darker images have a lower SNR than brighter images (assuming constant average noise)
The dynamic range is the SNR of the largest detectable signal V_max:
DR = 20 log10(V_max / V_noise) [dB]
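A small numerical illustration (a hedged reading of the formulas above, with V_noise denoting the mean noise voltage):

```python
import math

def snr_db(V, V_noise):
    """SNR = 20*log10(V / V_noise), in decibels."""
    return 20.0 * math.log10(V / V_noise)

# With constant noise, a 100x darker signal loses 40 dB of SNR;
# the dynamic range is the SNR of the largest detectable signal.
bright = snr_db(1.0, 0.001)   # 60 dB
dark = snr_db(0.01, 0.001)    # 20 dB
dr = snr_db(1.0, 0.001)       # DR = 20*log10(V_max / V_noise)
```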
Digitalization
The analogue voltage signal is normally transformed to a digital
representation by means of an analog-to-digital converter (ADC)
Two common principles:
Flash ADC (up to 8 bits)
Successive approximation (>8 bits)
b = DR / (20 log10 2) ≈ DR / 6.02 (each bit covers about 6 dB)
Flash ADC
Also called the parallel A/D converter
It is formed of a series of
comparators, each one comparing
the input signal to a unique
reference voltage.
The comparator outputs connect to
the inputs of a priority encoder
circuit, which then produces a
binary output.
Vref is a stable reference voltage.
As the analog input voltage exceeds
the reference voltage at each
comparator, the comparator
outputs will sequentially saturate
to a high state.
The priority encoder generates a
binary number based on the
highest-order active input, ignoring
all other active inputs.
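A behavioural sketch of the flash principle (comparator thresholds and the encoder are simplified; all names are mine):

```python
def flash_adc(v_in, v_ref, bits):
    """Flash ADC: 2**bits - 1 comparators compare v_in against
    equally spaced fractions of v_ref; the priority encoder outputs
    the index of the highest comparator that saturated high."""
    n = 2 ** bits - 1
    thresholds = [v_ref * (i + 1) / (n + 1) for i in range(n)]
    fired = [v_in > t for t in thresholds]   # all compare in parallel
    code = 0
    for i, hit in enumerate(fired):          # priority encoder:
        if hit:                              # highest active input wins
            code = i + 1
    return code

code = flash_adc(0.6, 1.0, 3)   # 3 bits -> 7 comparators, code 4
```

The comparator count doubles per added bit, which is why flash converters are practical only up to about 8 bits.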
Image acquisition
Single imaging sensor
Image acquisition
Combining a single sensor with motion to
generate a 2-D image.
Color vision
The human eye has cones which are sensitive to
different wavelength bands
Extra green (the filter mosaic samples green twice per 2×2 cell)
Color post-processing
We can see the image detected by the sensor as a
mono-chrome signal (the raw image)
An RGB signal (3 components per pixel) is then
produced by interpolation from the raw image,
using different, spatially varying filters for each
of the three components (demosaicking)
Note: two types of filtering
An optical filter on the light before the sensor
An electronic filter on the image signal to produce
RGB signal
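A deliberately minimal demosaicking sketch for an RGGB Bayer raw image (real cameras use better, spatially varying interpolation; the layout and the names are assumptions):

```python
import numpy as np

def demosaic_blocky(raw):
    """Fill each 2x2 Bayer cell (R G / G B) with one RGB value:
    R and B are taken directly, G is the mean of the two G samples."""
    h, w = raw.shape
    rgb = np.empty((h, w, 3))
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            r = raw[y, x]
            g = 0.5 * (raw[y, x + 1] + raw[y + 1, x])
            b = raw[y + 1, x + 1]
            rgb[y:y + 2, x:x + 2] = (r, g, b)
    return rgb

raw = np.array([[0.8, 0.4],
                [0.6, 0.2]])   # a single RGGB cell
rgb = demosaic_blocky(raw)     # every pixel becomes (0.8, 0.5, 0.2)
```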
Color processing
The perception of color is complex
Humans tend to perceive color independent of
illumination
A color camera makes a measurement of physical
quantities: very dependent on illumination
White balancing
Transforms the color measurement so that what we perceive as white gives equal RGB values
Automatic or manual
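Automatic white balancing is often sketched with the gray-world assumption (the average scene colour should come out grey); a minimal version, with names of my own:

```python
import numpy as np

def gray_world(rgb):
    """Gray-world white balance: scale each channel so that all
    three channel means become equal."""
    means = rgb.reshape(-1, 3).mean(axis=0)
    gains = means.mean() / means
    return rgb * gains

img = np.ones((2, 2, 3)) * np.array([0.8, 0.4, 0.4])  # reddish cast
balanced = gray_world(img)   # channel means are now all equal
```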
Video camera
Basic idea: take one image after another in sequence
(temporal sampling)
Legacy television standards (PAL, NTSC, ...) require interlaced video
Take one half-image with all odd rows and then another half-image with all even rows: odd, even, odd, even, etc.
Odd and even rows are exposed at different times
Motivation: better bandwidth usage in broadcasted TV
Progressive scan
E.g., one full image at 25 Hz
Rapid development
The technology related to image sensors is in
rapid development
The components are constantly becoming smaller (Moore's law)
New solutions to various problems appear at high
pace
More and more functionality is being integrated with
the image sensor
Image sensors are being integrated with other
functionalities (all kinds of supervision, control, and
surveillance anywhere and everywhere)