Course Outline
Introduction
Analog vs. digital communication systems
A general communication system
Outline (contd)
Analog-to-digital conversion
Sampling (ideal, natural, sample-and-hold)
Quantization, PCM
Communication channels
Bandlimited channels
The AWGN channel, fading channels
Outline (contd)
Receiver design
General binary and M-ary signaling
Maximum-likelihood receivers
Performance in an AWGN channel
The Chernoff and union/Chernoff bounds
Simulation techniques
Signal spaces
Modulation: PAM, QAM, PSK, DPSK, coherent FSK, incoherent FSK
Outline (contd)
Channel coding
Block codes, hard- and soft-decision decoding, performance
Convolutional codes, the Viterbi algorithm, performance bounds
Trellis-coded modulation (TCM)
Outline (contd)
Signaling through fading channels
Rayleigh fading, optimum receiver, performance
Interleaving
Synchronization
Symbol synchronization
Frame synchronization
Carrier synchronization
Introduction
A General Communication System
Source → Transmitter → Channel → Receiver → User
Source: speech, video, etc.
Transmitter: conveys the information
Channel: invariably distorts signals
Receiver: extracts the information signal
User: utilizes the information
[Figure: analog transmission; the message signal is mixed with the output of an RF oscillator before the channel and receiver]
[Figure: digital transmission; a discrete-alphabet source sequence (e.g., 0110010...) drives the transmitter]
[Figure: binary transmission; waveforms s1(t) and s2(t) represent bits 1 and 0, noise is added in the channel to give r(t), which is integrated over (0, T), sampled at t = T, and passed to a comparator that decides 1 or 0]
[Figure: digital receiver chain; Channel → Demodulator → Channel Decoder → Source Decoder → D/A Conversion → User, with a synchronization block driving the demodulator]
Example: Let $\Omega = \{0, 1, 2\}$. Then:
1) $\mathcal{A} = \{\emptyset, \Omega\}$ is an algebra.
2) $\mathcal{A} = \{\emptyset, \Omega, \{0\}, \{1\}, \{2\}, \{0,1\}, \{1,2\}, \{0,2\}\}$ (the power set of $\Omega$) is an algebra.
3) $\mathcal{A} = \{\emptyset, \Omega, \{0\}, \{1\}, \{2\}\}$ is not an algebra (e.g., $\{0\} \cup \{1\} \notin \mathcal{A}$).
Probability Measure
Definition: A class of subsets, $\mathcal{A}$, of a space $\Omega$ is a $\sigma$-algebra (or a Borel algebra) if:
1) $A \in \mathcal{A} \Rightarrow A^c \in \mathcal{A}$.
2) $A_i \in \mathcal{A},\ i = 1, 2, 3, \ldots \Rightarrow \bigcup_{i=1}^{\infty} A_i \in \mathcal{A}$.

Definition: Let $\mathcal{A}$ be a $\sigma$-algebra of a space $\Omega$. A function $P$ that maps $\mathcal{A}$ into $[0,1]$ is called a probability measure if:
1) $P[\Omega] = 1$.
2) $P[A] \ge 0$ for all $A \in \mathcal{A}$.
3) $P\left[\bigcup_{i=1}^{\infty} A_i\right] = \sum_{i=1}^{\infty} P[A_i]$ whenever $A_i \cap A_j = \emptyset$, $i \ne j$.
Probability Measure
Let $\Omega = \mathbb{R}$ (the real line) and let $\mathcal{A}$ be the $\sigma$-algebra generated by the intervals $(x_1, x_2]$ in $\mathbb{R}$. Also, define a real-valued function $f$ on $\mathbb{R}$ such that:
1) $f(x) \ge 0$ for all $x \in \mathbb{R}$.
2) $\int_{-\infty}^{\infty} f(x)\,dx = 1$.
Then:
$$P[\{x \in \mathbb{R};\ x_1 < x \le x_2\}] = P[(x_1, x_2]] = \int_{x_1}^{x_2} f(x)\,dx.$$
Probability Space
The following conclusions can be drawn from the above definition:
1) $P[\emptyset] = 0$.
2) $P[A^c] = 1 - P[A]$ (since $P(A \cup A^c) = P(\Omega) = 1 = P(A) + P(A^c)$).
3) If $A_1 \subseteq A_2$, then $P(A_1) \le P(A_2)$.
4) $P[A_1 \cup A_2] = P[A_1] + P[A_2] - P[A_1 \cap A_2]$.

Definition: Let $\Omega$ be a space, $\mathcal{A}$ a $\sigma$-algebra of subsets of $\Omega$, and $P$ a probability measure on $\mathcal{A}$. Then the ordered triple $(\Omega, \mathcal{A}, P)$ is a probability space: $\Omega$ is the sample space, $\mathcal{A}$ the event space, and $P$ the probability measure.
The distribution function of a random variable $X$ can be written in terms of a density function $f_X$, with $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$, as follows:
$$F_X(x) = \int_{-\infty}^{x} f_X(\alpha)\,d\alpha.$$
Density Functions
We have the following observations based on the above definitions:
1) $F_X(-\infty) = \int_{-\infty}^{-\infty} f_X(x)\,dx = 0$.
2) $F_X(\infty) = \int_{-\infty}^{\infty} f_X(x)\,dx = 1$.
Example (Gaussian density):
$$f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}.$$
[Figure: example density $f_X(x)$, uniform with height 1 on $0 \le x \le 1$]
Conditional Probability
Let $A$ and $B$ be two events from the event space $\mathcal{A}$. Then the probability of event $A$, given that event $B$ has occurred, $P[A \mid B]$, is given by
$$P[A \mid B] = \frac{P[A \cap B]}{P[B]}.$$
Example: Consider the roll of a die: $P[\{2\} \mid \text{"even outcome"}] = 1/3$, while $P[\{2\} \mid \text{"odd outcome"}] = 0$. Thus, conditioning can increase or decrease the probability of an event, compared to its unconditioned value.

The Law of Total Probability: Let $A_1, A_2, \ldots, A_N$ be a partition of $\Omega$, i.e.,
$$\bigcup_{i=1}^{N} A_i = \Omega \quad \text{and} \quad A_i \cap A_j = \emptyset,\ i \ne j.$$
Then, for any $B \in \mathcal{A}$,
$$P[B] = \sum_{i=1}^{N} P[B \mid A_i]\, P[A_i].$$
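The die example above is easy to verify numerically. The following is a minimal Python sketch (not from the notes) that checks both conditional probabilities and the law of total probability with the even/odd partition:

```python
from fractions import Fraction

# Sample space of a fair die and the even/odd partition from the example.
omega = {1, 2, 3, 4, 5, 6}
even = {2, 4, 6}
odd = {1, 3, 5}
B = {2}

def prob(event):
    """P[event] under the uniform measure on the die's sample space."""
    return Fraction(len(event & omega), len(omega))

def cond_prob(A, C):
    """P[A | C] = P[A ∩ C] / P[C]."""
    return prob(A & C) / prob(C)

print(cond_prob(B, even))   # 1/3, as in the example
print(cond_prob(B, odd))    # 0

# Law of total probability: P[B] = sum_i P[B | A_i] P[A_i]
total = cond_prob(B, even) * prob(even) + cond_prob(B, odd) * prob(odd)
assert total == prob(B)     # 1/6
```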
[Figure: Venn diagram of a partition $A_1, A_2, A_3$ of $\Omega$ and an event $B$ overlapping the partition, illustrating $P(B \mid A_2)$]
[Figure: binary channel model with $\Pr(1) = 1/2$ and transition probabilities $P_{00}, P_{01}, P_{10}, P_{11}$, where $P_{11} = P[\text{receive } 1 \mid 1 \text{ sent}]$]
Bayes Law
Bayes Law: Let $A_1, A_2, \ldots, A_N$ be a partition of $\Omega$. Then
$$P[A_j \mid B] = \frac{P[B \mid A_j]\, P[A_j]}{\sum_{i=1}^{N} P[B \mid A_i]\, P[A_i]}.$$
Proof:
$$P[A_j \cap B] = P[A_j \mid B]\, P[B] = P[B \mid A_j]\, P[A_j]$$
$$\Rightarrow\quad P[A_j \mid B] = \frac{P[B \mid A_j]\, P[A_j]}{P[B]} = \frac{P[B \mid A_j]\, P[A_j]}{\sum_{i=1}^{N} P[B \mid A_i]\, P[A_i]},$$
where the last step uses the law of total probability.
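As an illustration of Bayes law, the sketch below computes the posterior probability that a 1 was sent given that a 1 was received over the binary channel pictured earlier. The notes give $\Pr(1) = 1/2$; the transition probabilities P11 and P10 used here are illustrative assumptions, not values from the notes:

```python
# Posterior P[1 sent | 1 received] for the binary channel via Bayes law.
p1 = 0.5          # Pr(1 sent), as in the figure
p0 = 1 - p1       # Pr(0 sent)
P11 = 0.9         # P[receive 1 | 1 sent]  (assumed value)
P10 = 0.2         # P[receive 1 | 0 sent]  (assumed value)

# Bayes law with the partition {1 sent, 0 sent}:
#   P[tx 1 | rx 1] = P[rx 1 | tx 1] P[tx 1]
#                    / (P[rx 1 | tx 1] P[tx 1] + P[rx 1 | tx 0] P[tx 0])
posterior = P11 * p1 / (P11 * p1 + P10 * p0)
print(posterior)  # ≈ 0.818 for these assumed transition probabilities
```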
Expectation
Consider a random variable $X$ with density $f_X(x)$. The expected (or mean) value of $X$ is given by
$$E[X] = \int_{-\infty}^{\infty} x\, f_X(x)\,dx.$$
More generally, for a function $g(X)$ of $X$,
$$E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f_X(x)\,dx.$$
When $g(X) = X^n$ for $n = 0, 1, 2, \ldots$, the corresponding expectations are referred to as the $n$-th moments of random variable $X$. The variance of a random variable $X$ is given by
$$\operatorname{var}(X) = \int_{-\infty}^{\infty} [x - E(X)]^2 f_X(x)\,dx = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx - E^2(X) = E(X^2) - E^2(X).$$
Example, Expectation
Example: Let $X$ be Gaussian with
$$f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right).$$
Then:
$$E(X) = \int_{-\infty}^{\infty} x\, \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx = \mu,$$
$$\operatorname{Var}(X) = E[X^2] - E^2(X) = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx - \mu^2 = \sigma^2.$$
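A quick Monte Carlo check of this example (a hedged sketch; the particular values of $\mu$ and $\sigma$ are arbitrary choices):

```python
# Monte Carlo check that E(X) = mu and Var(X) = sigma^2 for a Gaussian X.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 2.0, 3.0
x = rng.normal(mu, sigma, size=1_000_000)

print(x.mean())                     # ≈ 2.0  (E[X] = mu)
print(x.var())                      # ≈ 9.0  (Var[X] = sigma^2)
print((x**2).mean() - x.mean()**2)  # same value, via E(X^2) - E^2(X)
```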
Random Vectors
Definition: A random vector is a vector whose elements are random variables, i.e., if $X_1, X_2, \ldots, X_n$ are random variables, then $\mathbf{X} = (X_1, X_2, \ldots, X_n)$ is a random vector. Random vectors can be described statistically by their joint density function
$$f_{\mathbf{X}}(\mathbf{x}) = f_{X_1 X_2 \cdots X_n}(x_1, x_2, \ldots, x_n).$$
Example: Consider tossing a coin twice. Let $X_1$ be the random variable associated with the outcome of the first toss, defined by
$$X_1 = \begin{cases} 1, & \text{if heads} \\ 0, & \text{if tails,} \end{cases}$$
and similarly let $X_2$ be the random variable associated with the second toss. The vector $\mathbf{X} = (X_1, X_2)$ is a random vector.
Independence
Definition: Two random variables $X$ and $Y$ are independent if
$$f_{X,Y}(x, y) = f_X(x)\, f_Y(y).$$
The definition can be extended to independence among an arbitrary number of random variables, in which case their joint density function is the product of their marginal density functions. Definition: Two random variables X and Y are uncorrelated if
$$E[XY] = E[X]\,E[Y].$$
It is easily seen that independence implies uncorrelatedness, but not necessarily the other way around. Thus, independence is the stronger property.
The Characteristic Function
Definition: The characteristic function of a random variable $X$ is
$$\Phi_X(j\omega) = E\left[e^{j\omega X}\right] = \int_{-\infty}^{\infty} e^{j\omega x} f_X(x)\,dx.$$
Example: The characteristic function of a Gaussian random variable $X$ having mean $\mu$ and variance $\sigma^2$ is
$$\Phi_X(j\omega) = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{j\omega x}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx = e^{j\mu\omega - \frac{1}{2}\omega^2\sigma^2}.$$
The moment-generating function is defined similarly as
$$\Phi_X(s) = E\left[e^{sX}\right] = \int_{-\infty}^{\infty} e^{sx} f_X(x)\,dx.$$
Fact: The moment-generating function of a random variable $X$ can be used to obtain its moments according to:
$$E[X^n] = \left.\frac{d^n \Phi_X(s)}{ds^n}\right|_{s=0}.$$
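The moment formula is easy to verify symbolically. The sketch below assumes the Gaussian moment-generating function $\Phi_X(s) = e^{\mu s + \sigma^2 s^2/2}$, which follows from the characteristic function above by the substitution $j\omega \to s$:

```python
# Symbolic check of E[X^n] = d^n Phi_X(s)/ds^n at s = 0 for a Gaussian X.
import sympy as sp

s, mu, sigma = sp.symbols('s mu sigma', real=True)
Phi = sp.exp(mu * s + s**2 * sigma**2 / 2)  # Gaussian MGF (assumed form)

m1 = sp.simplify(sp.diff(Phi, s, 1).subs(s, 0))  # first moment
m2 = sp.simplify(sp.diff(Phi, s, 2).subs(s, 0))  # second moment
print(m1)                    # mu
print(m2)                    # mu**2 + sigma**2
print(sp.simplify(m2 - m1**2))  # variance: sigma**2
```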
Stochastic Processes
A stochastic process $\{X(t);\ -\infty < t < \infty\}$ is an ensemble of signals, each of which can be realized (i.e., observed) with a certain statistical probability. The value of a stochastic process at any given time, say $t_1$ (i.e., $X(t_1)$), is a random variable. Definition: A Gaussian stochastic process is one for which $X(t)$ is a Gaussian random variable for every time $t$.
[Figure: sample realizations of a fast-varying and a slow-varying process; amplitude vs. time over one second]
Consider a stochastic process $\{X(t);\ -\infty < t < \infty\}$. The random variable $X(t)$ at time $t$ has a density function $f_{X(t)}(x; t)$. The mean and variance of $X(t)$ are
$$E[X(t)] = \int_{-\infty}^{\infty} x\, f_{X(t)}(x; t)\,dx,$$
$$\operatorname{VAR}[X(t)] = E\left[\left(X(t) - E[X(t)]\right)^2\right].$$
Example: Consider the Gaussian random process whose value $X(t)$ at time $t$ is a Gaussian random variable having density
$$f_X(x; t) = \frac{1}{\sqrt{2\pi t}} \exp\left(-\frac{x^2}{2t}\right).$$
[Figure: the density $f_X(x; t)$ for $t = 2$]
The autocovariance and autocorrelation functions of a process $X$ are defined as
$$C_{XX}(t_1, t_2) = E\left[\left(X(t_1) - E[X(t_1)]\right)\left(X(t_2) - E[X(t_2)]\right)\right], \quad \forall\, t_1, t_2,$$
$$R_{XX}(t_1, t_2) = E\left[X(t_1)\, X(t_2)\right], \quad \forall\, t_1, t_2.$$
Definition: A process $X$ is mean-value stationary if its mean is not a function of time.
Definition: A random process $X$ is correlation stationary if the autocorrelation function $R_{XX}(t_1, t_2)$ is a function only of $\tau = t_1 - t_2$.
Definition: A random process $X$ is wide-sense stationary (W.S.S.) if it is both mean-value stationary and correlation stationary.
Spectral Density
Example (correlation stationary process): $X(t) = a\,\ldots$, for which the autocorrelation $R_{XX}(t_1, t_2)$ depends only on $\tau = t_1 - t_2$.
Definition: For a wide-sense stationary process we can define a spectral density, which is the Fourier transform of the process's autocorrelation function:
$$S_X(f) = \int_{-\infty}^{\infty} R_{XX}(\tau)\, e^{-j2\pi f\tau}\,d\tau.$$
The autocorrelation function is the inverse Fourier transform of the spectral density:
$$R_{XX}(\tau) = \int_{-\infty}^{\infty} S_X(f)\, e^{j2\pi f\tau}\,df.$$
In particular, the power in the process is $P_X = R_{XX}(0) = \int_{-\infty}^{\infty} S_X(f)\,df$.
$$S_Y(f) = S_X(f)\, |H(f)|^2$$
The spectral density at the output of a linear filter is the product of the spectral density of the input process and the magnitude square of the filter transfer function.
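A numerical illustration of this relation (a sketch under assumed parameters: unit-variance white noise as the input, so $S_X(f)$ is flat, and an arbitrary two-tap FIR filter):

```python
# Check S_Y(f) = S_X(f) |H(f)|^2 with Welch PSD estimates.
import numpy as np
from scipy import signal

rng = np.random.default_rng(1)
x = rng.standard_normal(2**18)      # white noise input (assumed)
h = np.array([1.0, 0.5])            # arbitrary FIR filter taps (assumed)
y = signal.lfilter(h, [1.0], x)     # filter the process

f, Sxx = signal.welch(x, fs=1.0, nperseg=4096)
_, Syy = signal.welch(y, fs=1.0, nperseg=4096)
_, H = signal.freqz(h, 1, worN=f, fs=1.0)   # H(f) at the same frequencies

# The estimated output/input PSD ratio should track |H(f)|^2.
ratio = Syy / (Sxx * np.abs(H)**2)
print(ratio.mean())   # ≈ 1, confirming S_Y = S_X |H|^2
```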
Analog-to-Digital Conversion
Two steps:
Discretize time: sampling
Discretize amplitude: quantization
[Figure: analog waveform to be digitized; amplitude vs. time]
Sampling
Signals are characterized by their frequency content. The Fourier transform of a signal describes its frequency content and determines its bandwidth:
$$X(f) = \int_{-\infty}^{\infty} x(t)\, e^{-j2\pi f t}\,dt$$
$$x(t) = \int_{-\infty}^{\infty} X(f)\, e^{j2\pi f t}\,df$$
[Figure: a pulse $x(t)$ (time, sec) and its transform $X(f)$ (frequency, Hz)]
Ideal Sampling
Mathematically, the sampled version, $x_s(t)$, of signal $x(t)$ is:
$$x_s(t) = h(t)\, x(t) \quad \Longleftrightarrow \quad X_s(f) = H(f) * X(f),$$
where the sampling function $h(t)$ is a periodic impulse train:
$$h(t) = \sum_{k=-\infty}^{\infty} \delta(t - kT_s) = \frac{1}{T_s} \sum_{k=-\infty}^{\infty} e^{j2\pi k t / T_s}.$$
[Figure: the sampling function $h(t)$, impulses every $T_s$, and the sampled signal $x_s(t)$]
Ideal Sampling
Taking the Fourier transform of the sampling function,
$$H(f) = \frac{1}{T_s} \sum_{k=-\infty}^{\infty} \delta\!\left(f - \frac{k}{T_s}\right).$$
Then:
$$X_s(f) = H(f) * X(f) = \frac{1}{T_s} \sum_{k=-\infty}^{\infty} X\!\left(f - \frac{k}{T_s}\right).$$
[Figure: the spectrum $X(f)$, bandlimited to $W$; (a) $X_s(f)$ for $f_s > 2W$, spectral replicas do not overlap (no aliasing); (b) $X_s(f)$ for $f_s < 2W$, replicas overlap (aliasing)]
Ideal Sampling
If $f_s > 2W$, the original signal $x(t)$ can be obtained from $x_s(t)$ through simple low-pass filtering. In the frequency domain, we have
$$X(f) = X_s(f)\, G(f), \quad \text{where} \quad G(f) = \begin{cases} T_s, & |f| \le B \\ 0, & \text{otherwise,} \end{cases} \quad \text{for } W \le B \le f_s - W.$$
The corresponding interpolating pulse is
$$g(t) = 2BT_s\, \frac{\sin(2\pi B t)}{2\pi B t}.$$
Ideal Sampling
[Figure: the ideal low-pass filter $G(f)$, height $T_s$ over $-B \le f \le B$, and its impulse response $g(t)$]
The Sampling Theorem: A bandlimited signal with no spectral components above $W$ Hz can be recovered uniquely from its samples taken every $T_s$ seconds, provided that
$$T_s \le \frac{1}{2W}, \quad \text{or, equivalently,} \quad f_s \ge 2W \quad \text{(the Nyquist rate)}.$$
Extraction of $x(t)$ from its samples can be done by passing the sampled signal through a low-pass filter. Mathematically, $x(t)$ can be expressed in terms of its samples by:
$$x(t) = \sum_{k=-\infty}^{\infty} x(kT_s)\, g(t - kT_s).$$
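The reconstruction formula can be demonstrated directly. The sketch below interpolates samples of a tone with $g(t)$ a normalized sinc (the $B = f_s/2$ choice); the tone frequency $W$ and rate $f_s$ are arbitrary choices satisfying $f_s > 2W$:

```python
# Sinc reconstruction: x_hat(t) = sum_k x(kTs) sinc((t - kTs)/Ts).
import numpy as np

W = 3.0                    # highest signal frequency, Hz (assumed)
fs = 10.0                  # sampling rate > 2W (assumed)
Ts = 1.0 / fs
k = np.arange(-200, 201)   # enough samples to make truncation error small

x = lambda t: np.cos(2 * np.pi * W * t)   # bandlimited test signal
t = np.linspace(-1, 1, 1001)              # reconstruction instants

# np.sinc(u) = sin(pi u) / (pi u), so sinc((t - kTs)/Ts) is the B = fs/2 pulse.
x_hat = np.array([np.sum(x(k * Ts) * np.sinc((ti - k * Ts) / Ts)) for ti in t])

print(np.max(np.abs(x_hat - x(t))))  # small reconstruction error
```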
Natural Sampling
A delta function can be approximated by a rectangular pulse $p(t)$:
$$p(t) = \begin{cases} \frac{1}{T}, & -\frac{T}{2} \le t \le \frac{T}{2} \\ 0, & \text{elsewhere,} \end{cases}$$
so that the sampling function becomes
$$h_p(t) = \sum_{k=-\infty}^{\infty} p(t - kT_s).$$
It can be shown that in this case as well the original signal can be reconstructed from its samples at or above the Nyquist rate through simple low-pass filtering.
Zero-Order-Hold Sampling
$$x_s(t) = p(t) * \left[x(t)\, h(t)\right]$$
[Figure: zero-order-hold sampled waveform $x_s(t)$ and the pulse spectrum $P(f)$]
To recover $x(t)$, the held samples are low-pass filtered and then equalized to undo the pulse shaping:
$$x_s(t) \rightarrow \text{Low-pass Filter } G(f) \rightarrow \text{Equalizer } 1/P(f) \rightarrow x(t)$$
Example: Music in general has a spectrum with frequency components up to about 20 kHz. The ideal (smallest) sampling frequency $f_s$ is then 40 ksamples/sec. The smallest practical sampling frequency is 44 ksamples/sec. In compact disc players, the sampling frequency is 44.1 ksamples/sec.
[Figure: a waveform and its samples taken at rate $f_s = 1/T_s > 2W$]
The sampled amplitude still takes values on a continuum => an infinite number of bits would be needed to represent it.
We need a finite number of possible amplitudes => quantization.
Quantization
Quantization is the process of discretizing the amplitude axis. It involves mapping an infinite number of possible amplitudes to a finite set of values.
Example (Quantization)
Let N=3 bits. This corresponds to L=8 quantization levels:
[Figure: a waveform $x(t)$ quantized to $L = 8$ levels, labeled with the 3-bit codewords 111, 110, 101, 100, 000, 001, 010, 011]
Quantization (contd)
There is an irrecoverable error due to quantization. It can be made small through appropriate design. Examples:
Telephone speech signals: 8-bit quantization
CD digital audio: 16-bit quantization
Input-Output Characteristic
[Figure: quantizer input-output staircase; the output $\hat{x} = Q(x)$ takes the values $\pm\frac{\Delta}{2}, \pm\frac{3\Delta}{2}, \pm\frac{5\Delta}{2}, \pm\frac{7\Delta}{2}$ as the input $x$ ranges over the corresponding intervals of width $\Delta$]
Quantization Error:
$$d = (x - \hat{x}) = \left(x - Q(x)\right)$$
The power in the signal is
$$P_X = E[X^2] = \frac{1}{T}\int_{-T/2}^{T/2} E[X^2(t)]\,dt,$$
and the distortion introduced by the quantizer is
$$D = E\left[(X - Q(X))^2\right] = \frac{1}{T}\int_{-T/2}^{T/2} E\left[(X(t) - Q(X(t)))^2\right]dt.$$
For a uniform quantizer with step size $\Delta$, the quantization error is well modeled as uniform on $(-\Delta/2, \Delta/2)$, i.e., $p(e) = 1/\Delta$ for $|e| \le \Delta/2$, so that
$$D = \frac{1}{\Delta}\int_{-\Delta/2}^{\Delta/2} e^2\,de = \frac{\Delta^2}{12}.$$
With $N$ bits over an input range $(-V, V)$, the step size is $\Delta = 2V/2^N$, and the signal-to-quantization-noise ratio is
$$\text{SQNR} = 10\log_{10}\frac{P_X}{D} = 6.02\,N + 1.76 \text{ dB},$$
the last equality holding for a full-scale sinusoidal input, for which $P_X = V^2/2$.
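The $6.02N + 1.76$ dB rule is easy to check numerically. The sketch below quantizes a full-scale sinusoid with an $N$-bit mid-rise uniform quantizer (the mid-rise choice and the values of $N$, $V$, and the tone frequency are assumptions for illustration):

```python
# Measured SQNR of an N-bit uniform quantizer vs. the 6.02N + 1.76 dB rule.
import numpy as np

N = 8                      # bits (assumed)
V = 1.0                    # quantizer range (-V, V) (assumed)
delta = 2 * V / 2**N       # step size

t = np.linspace(0, 1, 1_000_000, endpoint=False)
x = V * np.sin(2 * np.pi * 17 * t)        # full-scale sine (assumed tone)

# Mid-rise quantizer: levels at odd multiples of delta/2, clipped to range.
xq = delta * (np.floor(x / delta) + 0.5)
xq = np.clip(xq, -V + delta / 2, V - delta / 2)

sqnr = 10 * np.log10(np.mean(x**2) / np.mean((x - xq)**2))
print(sqnr, 6.02 * N + 1.76)   # both ≈ 49.9 dB for N = 8
```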
Example
A zero-mean, stationary Gaussian source $X(t)$ having the spectral density given below is to be quantized using a 2-bit quantizer. The quantization intervals and levels are as indicated. Find the resulting SQNR.
$$S_X(f) = \frac{200}{1 + (2\pi f)^2} \quad \Longleftrightarrow \quad R_{XX}(\tau) = 100\, e^{-|\tau|}$$
$$P_X = R_{XX}(0) = 100$$
$$f_X(x) = \frac{1}{\sqrt{200\pi}}\, e^{-x^2/200}$$
[Figure: quantizer with interval boundaries at $-10, 0, 10$ and quantized values at $-15, -5, 5, 15$]
$$D = E\left[(X - Q(X))^2\right] = 2\int_0^{10} (x - 5)^2 f_X(x)\,dx + 2\int_{10}^{\infty} (x - 15)^2 f_X(x)\,dx.$$
Non-uniform Quantization
In general, the optimum quantizer is non-uniform. Optimality conditions (Lloyd-Max):
The boundaries of the quantization intervals are the mid-points of the corresponding quantized values.
The quantized values are the centroids of the quantization regions.
Optimum quantizers are designed iteratively using the above rules (see the sketch below).
We can also talk about optimal uniform quantizers. These have equal-length quantization intervals (except possibly the two at the boundaries), and the quantized values are at the centroids of the quantization intervals.
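A minimal sample-based sketch of the Lloyd-Max iteration, assuming a zero-mean, unit-variance Gaussian source and an empirical (training-data) version of the two optimality conditions; the initial levels and L = 4 are arbitrary choices:

```python
# Lloyd-Max design by alternating the two optimality conditions on samples.
import numpy as np

rng = np.random.default_rng(2)
samples = rng.normal(0.0, 1.0, 200_000)   # training data for the source
L = 4                                      # number of quantization levels
levels = np.linspace(-1.5, 1.5, L)         # initial quantized values

for _ in range(50):
    # Condition 1: boundaries are midpoints of adjacent quantized values.
    bounds = (levels[:-1] + levels[1:]) / 2
    # Condition 2: quantized values are centroids of their regions.
    idx = np.digitize(samples, bounds)
    levels = np.array([samples[idx == j].mean() for j in range(L)])

print(levels)   # ≈ [-1.51, -0.45, 0.45, 1.51] for a unit-variance Gaussian
```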
Companding (compressing-expanding)
[Figure: companding system; Compressor → Uniform Quantizer on the transmit side, Expander → Low-pass Filter on the receive side]
$\mu$-Law Companding:
$$g(x) = \operatorname{sgn}(x)\, \frac{\ln(1 + \mu|x|)}{\ln(1 + \mu)}, \quad -1 \le x \le 1.$$
[Figure: compressor characteristic $g(x)$ for $\mu = 0, 10, 255$]
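A small sketch of $\mu$-law companding using the compressor formula above and its algebraic inverse as the expander ($\mu = 255$, one of the values shown in the plot):

```python
# mu-law compressor and its inverse (expander).
import numpy as np

MU = 255.0

def compress(x):
    """g(x) = sgn(x) ln(1 + mu|x|) / ln(1 + mu), for -1 <= x <= 1."""
    return np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)

def expand(y):
    """Algebraic inverse: |x| = ((1 + mu)^{|y|} - 1) / mu."""
    return np.sign(y) * ((1 + MU) ** np.abs(y) - 1) / MU

x = np.linspace(-1, 1, 11)
y = compress(x)
print(np.allclose(expand(y), x))   # True: the expander undoes the compressor
```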
Data Compression
Analog source: Analog Source → Sampler → Quantizer → Source Encoder → 001011001... (the sampler and quantizer together form the A/D converter, whose output is a discrete source)
Discrete source: Discrete Source → 01101001... → Source Encoder → 10011...
The job of the source encoder is to represent the digitized source efficiently, i.e., using the smallest number of bits.
For a memoryless source, the probability of a sequence of symbols being produced equals the product of the probabilities of the individual symbols.
Measuring Information
Not all sources are created equal.
Example: A discrete source with $P(0) = 1$, $P(1) = 0$ provides no information: its output is known in advance. A source with $P(0) = 0.99$ provides very little information per symbol.
Definition: The average amount of information, in bits/symbol, provided by a binary source with $P(0) = p$ is
$$H(X) = -p \log_2(p) - (1-p)\log_2(1-p).$$
[Figure: the binary entropy function $H(X)$ vs. $p$; it peaks at 1 bit/symbol at $p = 0.5$ and is 0 at $p = 0$ and $p = 1$]
Non-Binary Sources
In general, the entropy of a source that produces $L$ symbols with probabilities $p_1, p_2, \ldots, p_L$ is
$$H(X) = -\sum_{i=1}^{L} p_i \log_2(p_i) \text{ bits},$$
and it satisfies
$$0 \le H(X) \le \log_2(L),$$
with equality on the right iff the source probabilities are all equal.
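A small helper that evaluates this entropy formula, checked against the binary-source values above and the equal-probability upper bound:

```python
# Entropy H(X) = -sum_i p_i log2 p_i of a discrete source.
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                   # 0 log 0 = 0 by convention
    return -np.sum(p * np.log2(p))

print(entropy([0.5, 0.5]))         # 1.0 bit/symbol (maximum for L = 2)
print(entropy([1.0, 0.0]))         # 0.0: a deterministic source
print(entropy([0.99, 0.01]))       # ≈ 0.081 bits/symbol: very little information
print(entropy([1/8] * 8))          # 3.0 = log2(8): equal probabilities
```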
Example: A four-symbol source and a variable-length code for it:
Symbol   Codeword   Length $m_i$   Probability $p_i$
a        0          1              3/8
b        11         2              3/8
c        100        3              1/8
d        101        3              1/8
$$\bar{M} = \sum_{i=1}^{4} m_i p_i = 1\cdot\frac{3}{8} + 2\cdot\frac{3}{8} + 3\cdot\frac{1}{8} + 3\cdot\frac{1}{8} = 1.875 \text{ bits/symbol}$$
$$H(X) = -\sum_{i=1}^{4} p_i \log_2(p_i) = 1.811 \text{ bits/symbol}$$
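The codeword lengths in this example match those of a Huffman code. A compact, illustrative heap-based construction (an assumption of method, not taken from the notes) recovers the same lengths and average length:

```python
# Huffman codeword lengths by repeatedly merging the two smallest nodes.
import heapq

def huffman_lengths(probs):
    """Return the Huffman codeword length for each symbol probability."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    lengths = [0] * len(probs)
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)   # two smallest-probability nodes
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:              # each merge adds one bit to members
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

probs = [3/8, 3/8, 1/8, 1/8]           # a, b, c, d from the example
lengths = huffman_lengths(probs)
print(lengths)                          # [1, 2, 3, 3], as in the table
avg = sum(l * p for l, p in zip(lengths, probs))
print(avg)                              # 1.875 bits/symbol, vs H(X) = 1.811
```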
Example: A file of … bits contains 100,000 1s; such a file can be compressed by more than a factor of 2.
[Figure: Huffman code construction; at each step the two smallest-probability nodes are merged (probabilities shown include .001, .009, .01, .028, .109, .162, .271, summing toward 1.0) and the branches are labeled 0 and 1 to form the codewords]
$$\bar{M}_1 = 4(1 - 0.43) + 1(0.43) = 2.71$$
$$\bar{M}_2 = 1(0.1) + 2(0.09) + 3(0.081) + 4(0.073) + 5(0.066) + 6(0.059) + 7(0.053) + 8(0.048) + 8(0.430) = 5.710$$
$$\bar{M} = \frac{\bar{M}_1}{\bar{M}_2} = \frac{2.710}{5.710} = 0.475$$