CME620
2016, Week 1
Lecture Guide

Topics
1. Course Introduction
2. Set Theory and Venn Diagrams
   - Unions, Intersections, Complements, etc.
3. Probability Theory
   - Probability Space and Probability Measure
   - Axioms of Probability
   - Conditional Probability
   - Independence of Events (mutually exclusive events)
   - Partition and the Law of Total Probability
   - Bayes' Rule
4. Random Variables
   - Definition and Characterization of One Random Variable
   - Probability Distribution Function (CDF) and its properties
   - Probability Density Function (PDF) and its properties
   - Probability Mass Function (PMF) and its properties
   - Conditional distributions and densities
   - Important Random Variables (Discrete and Continuous)
     Discrete: Bernoulli, Binomial, Poisson, Hypergeometric
     Continuous: Uniform, Gaussian, Exponential, Rayleigh, Nakagami

(c) Prof. Okey Ugweje, Federal University of Technology, Minna
Set Theory - 1

Definition:
A set is a collection of distinct objects called elements,
usually written as a list of elements enclosed in braces { }.
Since elements must be distinct, 2 or more elements in a set
cannot be the same.

Example 1:
{1,2,3} is a valid set, whereas {1,1,3} is not.

A set can be made up of elements which are themselves sets.
A set can be finite or infinite.

Example 2:
The set of all non-negative integers {0,1,2,3,...} is countably
infinite, whereas the set of all real numbers in [0,1] is
uncountably infinite.

All sets are subsets of the sample space.

Set Theory - 2

Definition:
The union of two sets A and B (denoted A ∪ B) is the set that
contains all elements in either A or B:
A ∪ B = {x | x ∈ A or x ∈ B}
For more than two sets,
A1 ∪ A2 ∪ ... ∪ An = ∪_{i=1}^{n} Ai
Set Theory - 3

Example 3:
If A = {1,2,4} and B = {1,3,5}, then A ∪ B = {1,2,3,4,5}.

Definition:
A set A is a subset of a set B (denoted A ⊆ B) if all the elements
of the set A are also in the set B.

Example 5:
Set A = {1, 2} is a subset of set B = {1, 2, 3, 5}.

Definition:
The intersection of two sets A and B (denoted A ∩ B) is the set
that contains only the elements that appear in both sets:
A ∩ B = {x | x ∈ A and x ∈ B}
For more than two sets,
A1 ∩ A2 ∩ ... ∩ An = ∩_{i=1}^{n} Ai

Example 4:
If A = {1, 2, 4} and B = {1, 3, 5}, then A ∩ B = {1}.

Definition:
The complement of a set B (denoted Bc) is the set of all elements
of the universal set that are not in B.

Example 6:
If S = {1, 2, 3, 4, 5} and B = {1, 2, 3}, then Bc = {4, 5}.

Set Theory - 4

Set Operators:
S = universal set
∅ = null set
∪ = union
∩ = intersection
⊂, ⊆ = subsets
∈ = element of
Venn Diagrams - 1

Venn Diagrams - 2
Department of Telecommunications Engineering

Union (Sum) identities:
∪_{k=1}^{n} Ak = A1 ∪ A2 ∪ ... ∪ An
A ∪ A = A
A ∪ ∅ = A
A ∪ S = S
A ∪ Ac = S
A ∪ (B ∪ C) = (A ∪ B) ∪ C
A ∪ B = A if B ⊂ A
Venn Diagrams - 3

Venn Diagrams - 4

Intersection (Product):
- Elements common to all sets
- Events that occur in every experiment
- Series systems

Intersection identities:
∩_{k=1}^{n} Ak = A1 ∩ A2 ∩ ... ∩ An
A ∩ A = A
A ∩ S = A
A ∩ ∅ = ∅
A ∩ (B ∩ C) = (A ∩ B) ∩ C
Venn Diagrams - 5

Complement and difference:
A ∪ Ac = S
A ∩ Ac = ∅
(Ac)c = A
Sc = ∅,  ∅c = S
Difference: A − B = A ∩ Bc

Venn Diagrams - 6

Mutually exclusive events: A and B are mutually exclusive if
A ∩ B = ∅.
Venn Diagrams - 7

Subsets: A ⊂ B means every element of A is also in B.

De Morgan's Law:
(A ∪ B)c = Ac ∩ Bc ;  (A ∩ B)c = Ac ∪ Bc
In general,
(∪_{i=1}^{n} Ai)c = ∩_{i=1}^{n} Aic ;  (∩_{k=1}^{n} Bk)c = ∪_{k=1}^{n} Bkc

(Venn diagrams illustrating subsets and intersections such as A∩B,
E∩F, E∩G, F∩G and A∩B∩C.)
Example 8

(Worked example; from the surviving fragments, the computed
probability is 25/130 ≈ 0.192.)
Probability Theory - 1

Probability Theory - 2
Some Applications - 1

For example:
- Data sent through a communication system is random, since the
  outcome at the receiver is not certain.
- Noise, interference and fading introduced by the channel are
  random processes and can only be modeled as such.
- The measure of performance (e.g., Bit Error Rate) is
  probabilistic, since it is an estimate of the received signal
  compared to the transmitted signal.
Some Applications - 2

(Block diagram: Input Signal → System → Output Signal, with
received signal s(t) + n(t).)

Noise n(t) is almost always random in nature and calls for the
use of probabilistic methods even if the signal s(t) is not, e.g.:
- Thermal noise: thermal motion of the conduction electrons in
  the amplifier input circuit
- Random variations in the number of electrons (or holes)
  passing through a transistor
Since there are millions of electrons, one cannot calculate the
value of this kind of noise at every instant of time, but one can
calculate its average statistical behavior.

Some Applications - 3
Some Applications - 4

Quality Control
An important method of improving system reliability is to improve
the quality of the individual elements. This is often done by a
sampling inspection process, since it would be too costly to
inspect every element.

Some Applications - 5

Some Applications - 6

Probability Concepts
"We see that the theory of probability is at heart only common
sense reduced to calculation..."
- Pierre-Simon Laplace
Probability Concepts

Probability Spaces
Tossing of 2 Dice

a) Dice are distinguishable:
S1 = {(1,1), (1,2), ..., (1,6); (2,1), (2,2), ..., (2,6);
      (3,1), (3,2), ..., (3,6); (4,1), (4,2), ..., (4,6);
      (5,1), (5,2), ..., (5,6); (6,1), (6,2), ..., (6,6)}
giving 6+6+6+6+6+6 = 36 elements (or 6^2).

b) Dice are indistinguishable (a pair cannot be told apart from
its reverse):
S2 = {(1,1), (1,2), ..., (1,6); (2,2), (2,3), ..., (2,6);
      (3,3), ..., (3,6); (4,4), (4,5), (4,6); (5,5), (5,6); (6,6)}
giving 6+5+4+3+2+1 = 21 elements.

c) One may also use the tabular method.
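The two sample spaces above can be enumerated directly; a minimal sketch:

```python
from itertools import product

# Ordered pairs: distinguishable dice
S1 = list(product(range(1, 7), repeat=2))

# Unordered pairs: indistinguishable dice (keep (i, j) with i <= j)
S2 = [(i, j) for i, j in S1 if i <= j]

print(len(S1), len(S2))  # 36 21
```

The counts match 6^2 = 36 and 6+5+4+3+2+1 = 21.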
Event - 1

Definition:
An Event, A, is a set of outcomes; a subset of the sample space.
An event is any possible outcome (or collection of outcomes) of an
experiment; it is the simplest random phenomenon.
An event is sometimes described as the information space.
Each event has an associated quantity which characterizes the
objective likelihood of occurrence of that event; that quantity is
the probability of the event.

Example 11 - Events

Event - 2

Special events
There are two special events of interest:
1) Universal Set (Ω or S)
Set containing all elements; the totality of all elementary events
ωi, known a priori:
S = {ω1, ω2, ..., ωk, ...}
2) Null Set (∅): the event containing no outcomes.
Definition of Probability

Axioms of Probability

"The theory of probability as a mathematical discipline can and
should be developed from axioms in exactly the same way as
geometry and algebra."
- Andrey Kolmogorov

Axioms of Probability - 1

(i)  P[A] ≥ 0
(ii) P[S] = 1
(iii) If A ∩ B = ∅, then P[A ∪ B] = P[A] + P[B]

Axioms of Probability - 2

Note:
(iii) states that if A and B are mutually exclusive (M.E.) events,
the probability of their union is the sum of their probabilities,
i.e.,
P[A ∪ B] = P[A] + P[B], if A and B cannot occur simultaneously.
This is the minimum number of axioms required to establish the
remaining concepts of probability. These axioms allow us to view
events as objects with properties.
For a countable sequence of mutually exclusive events,
P[∪_{k=1}^{∞} Ak] = Σ_{k=1}^{∞} P[Ak]
Axioms of Probability - 3

a) For any A, A ∪ Ac = S, so P[A ∪ Ac] = P[S] = 1.
Since A and Ac are M.E. events,
P[A ∪ Ac] = P[A] + P[Ac] = 1
Hence P[Ac] = 1 − P[A].
Also, for any A and B, A ∪ B = A ∪ (Ac ∩ B),
where A and Ac ∩ B are clearly M.E. events.

Axioms of Probability - 4

b) Similarly, for any A, A ∪ ∅ = A.
Hence it follows that P[A] = P[A ∪ ∅] = P[A] + P[∅],
since A and ∅ are M.E.; thus P[∅] = 0.
Axioms of Probability - 5

Corollary 1:
P[Ac] = 1 − P[A], since P[S] = P[A ∪ Ac] = P[A] + P[Ac] = 1.

Corollary 2:
For any A and B, B = (B ∩ A) ∪ (B ∩ Ac), where B ∩ A and B ∩ Ac
are M.E. events. Hence
P[B] = P[B ∩ A] + P[B ∩ Ac]
and thus P[Ac ∩ B] = P[B] − P[A ∩ B].

Axioms of Probability - 6

Using these relations,
P[A ∪ B] = P[A] + P[B] − P[A ∩ B]

Axioms of Probability - 7

Corollary 3:
P[A ∪ B] = P[A] + P[B] − P[A ∩ B]

Corollary 4:
A ∪ B = A ∪ (Ac ∩ B), with A and Ac ∩ B M.E., so
P[A ∪ B] = P[A] + P[Ac ∩ B]
Also B = (A ∩ B) ∪ (Ac ∩ B), so
P[B] = P[A ∩ B] + P[Ac ∩ B], i.e., P[Ac ∩ B] = P[B] − P[A ∩ B]
Substituting yields
P[A ∪ B] = P[A] + P[B] − P[A ∩ B]

Axioms of Probability - 8

Corollary 5 (union bound):
P[∪_{k=1}^{n} Ak] ≤ Σ_{k=1}^{n} P[Ak],  n ≥ 2

Axioms of Probability - 9

For three events,
P[A ∪ B ∪ C] = P[(A ∪ B) ∪ C]
  = P[A ∪ B] + P[C] − P[(A ∪ B) ∩ C]
  = P[A] + P[B] + P[C] − P[AB] − P[AC] − P[BC] + P[ABC]
Axioms of Probability - 10

Corollary 6:
In general, for n events,
P[∪_{k=1}^{n} Ak] = Σ_k P[Ak] − Σ_{j<k} P[Aj ∩ Ak] + ...
                    + (−1)^{n+1} P[A1 ∩ ... ∩ An]

Corollary 7:
If A ⊂ B, then P[A] ≤ P[B].
Proof: B = A ∪ (Ac ∩ B), so
P[B] = P[A] + P[Ac ∩ B] ≥ P[A], since P[Ac ∩ B] ≥ 0.

These axioms and corollaries provide us with the rules (or laws)
for computing the probability of events.

Example 12

Example 13
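Inclusion-exclusion for three events can be checked by brute-force enumeration; a minimal sketch, with hypothetical events A, B, C defined on the two-dice sample space:

```python
from itertools import product

# 36 equally likely outcomes of two distinguishable dice.
S = list(product(range(1, 7), repeat=2))
A = {s for s in S if s[0] == 1}       # first die shows 1
B = {s for s in S if s[1] == 1}       # second die shows 1
C = {s for s in S if sum(s) == 7}     # sum is 7

P = lambda E: len(E) / len(S)
lhs = P(A | B | C)
rhs = (P(A) + P(B) + P(C)
       - P(A & B) - P(A & C) - P(B & C)
       + P(A & B & C))
print(abs(lhs - rhs) < 1e-12)  # True
```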
Probability Problems - 1

Probability Problems - 2

Probability Problems - 3

For experiments with equally likely outcomes over a geometric
region, probability can be computed as a ratio of sizes:
P[A] = L(A)/L(S)   (length)
P[A] = A(A)/A(S)   (area)
P[A] = V(A)/V(S)   (volume)

Probability Problems - 4

Example 14
Two dice are tossed. Let E = {sum is 8}, F = {sum is 5},
T = {sum is 7}.

Solution
Let S = {all possible outcomes}, with |S| = 36.
(a) Counting outcomes in the 6-by-6 table of sums:
P[E] = 5/36;  P[F] = 4/36;  P[T] = 6/36
(The table marking the outcomes belonging to E, F and T is
omitted here.)
Example 16

Two dice are tossed; in the 6-by-6 table of outcomes, O marks an
odd sum and E marks an even sum. By symmetry, half of the 36
outcomes have an odd sum:
P[O] = 18/36 = 1/2
Example 15

(b) Since no outcome can have two different sums, F, T and E are
mutually exclusive, so
P[F, T or E] = P[F ∪ T ∪ E] = P[F] + P[T] + P[E]
             = 4/36 + 6/36 + 5/36 = 15/36

For the three-bit experiment tabulated below,
P[A ∩ B] = 2/8;  P[B ∩ C] = 2/8;  P[A ∩ B ∩ C] = 1/8
OUTCOMES
Outcome:  1  2  3  4  5  6  7  8
X:        0  0  0  0  1  1  1  1
Y:        0  0  1  1  0  0  1  1
Z:        0  1  0  1  0  1  0  1
(Events A, B and C are defined as sets of these outcomes; the
exact markings are only partially recoverable.)
Conditional Probability - 1

Conditional Probability Theory

We define the conditional probability of A given B as
P[A | B] = P[A ∩ B] / P[B],  provided P[B] ≠ 0

Conditional Probability - 2

P[· | B] satisfies all the probability axioms. For example, if A
and C are mutually exclusive,
P[A ∪ C | B] = P[(A ∪ C) ∩ B] / P[B] = P[(A∩B) ∪ (C∩B)] / P[B]
But (A∩B) ∩ (C∩B) = ∅, hence
P[(A∩B) ∪ (C∩B)] = P[A∩B] + P[C∩B], so
P[A ∪ C | B] = P[A∩B]/P[B] + P[C∩B]/P[B] = P[A | B] + P[C | B]

Conditional Probability - 3

Properties:
1. If B ⊂ A, then A ∩ B = B, and
P[A | B] = P[A ∩ B]/P[B] = P[B]/P[B] = 1

Conditional Probability - 4

Conditional Probability - 5

2. If A ⊂ B, then A ∩ B = A, and
P[A | B] = P[A ∩ B]/P[B] = P[A]/P[B] ≥ P[A],
since P[B] ≤ 1.
Example 17
Given P[A] = 1/2, P[B] = 1/3 and P[A ∩ B] = 1/4:

a) P[A | B] = P[A ∩ B]/P[B] = (1/4)/(1/3) = 3/4

b) P[B | A] = P[A ∩ B]/P[A] = (1/4)/(1/2) = 1/2

c) P[A ∪ B] = P[A] + P[B] − P[A ∩ B] = 1/2 + 1/3 − 1/4 = 7/12

d) P[Ac | Bc] = P[Ac ∩ Bc]/P[Bc].
But Ac ∩ Bc = (A ∪ B)c, so
P[Ac ∩ Bc] = 1 − P[A ∪ B] = 1 − 7/12 = 5/12
and P[Bc] = 1 − P[B] = 1 − 1/3 = 2/3.
Hence P[Ac | Bc] = (5/12)/(2/3) = 5/8.

e) P[Bc | Ac] = P[Bc ∩ Ac]/P[Ac] = (5/12)/(1/2) = 5/6
Example 18

Let
C+ = {has cancer};  C− = {no cancer};  R = {positive reaction}
with
P[R | C+] = 0.9;  P[R | C−] = 0.05;  P[C+] = 0.01;  P[C−] = 0.99

By Bayes' rule,
P[C+ | R] = P[R | C+] P[C+] / (P[R | C+] P[C+] + P[R | C−] P[C−])
          = (0.9)(0.01) / ((0.9)(0.01) + (0.05)(0.99))
          ≈ 0.154

So even given a positive reaction, the probability of actually
having cancer is only about 15.4%.

Independence
Independence - 1

Two events A and B are independent if
P[A ∩ B] = P[A] P[B]
It is easy to show that if A and B are independent, then
(A, Bc), (Ac, B) and (Ac, Bc) are all independent pairs.

Independence - 2

If A and B are independent,
P[A | B] = P[A ∩ B]/P[B] = P[A] P[B]/P[B] = P[A]

Independence - 3

Since P[A ∩ B] = P[A | B] P[B] = P[B | A] P[A],
P[B | A] = P[A ∩ B]/P[A]
and
P[A | B] = P[B | A] P[A] / P[B]   (Bayes' Theorem)

Example 19

A card is drawn from an ordinary 52-card deck with four suits
(club, diamond, heart, spade), each consisting of thirteen cards:
ace, two, ..., ten, jack, queen, king.
Example

For each suit the sample space consists of ace, two, ..., ten,
jack, queen, king, indicated as {1, 2, ..., 13}.
Let A = {king is drawn}, B = {club is drawn}.
Describe the events:
a) A ∪ B = {either king or club (or both, i.e., king of clubs)}
b) A ∩ B = {both king and club (king of clubs)}
c) Since B = {clubs}, Bc = {not club} = {hearts, diamonds, spades}
Example

Solution:
P[A] = 4/52;  P[B] = 8/52;  P[C] = 13/52
Also A ∩ B = ∅, so P[A ∩ B] = 0.
Furthermore,
P[A ∩ C] = 1/52;  P[B ∩ C] = 2/52
Therefore
P[A] P[C] = (4/52)(13/52) = 1/52 = P[A ∩ C]
P[B] P[C] = (8/52)(13/52) = 2/52 = P[B ∩ C]
so A and C are independent, and B and C are independent.
Partition Law
Bayes Rule
Laws of Total Probability
Introduction to Markov Chains
Counting Techniques
Sampling of Different Kinds
1. Sampling with replacement and with ordering
2. Sampling without replacement and with ordering
3. Sampling without replacement and without ordering
4. Sampling with replacement and without ordering

Partition - 1

Partition (Law of Total Probability)

Partition - 2

A collection of events B1, B2, ..., Bn partitions the sample
space S if the Bi are mutually exclusive and
B1 ∪ B2 ∪ ... ∪ Bn = S.

Partition - 3

For any event A,
A = A ∩ S
  = A ∩ (B1 ∪ B2 ∪ ... ∪ Bn)
  = (A ∩ B1) ∪ (A ∩ B2) ∪ ... ∪ (A ∩ Bn)
Since the events A ∩ Bk are mutually exclusive, and
P[A ∩ Bk] = P[A | Bk] P[Bk], we may write
P[A] = P[A ∩ B1] + P[A ∩ B2] + ... + P[A ∩ Bn]

Partition - 4

Hence
P[A] = Σ_{k=1}^{n} P[A ∩ Bk] = Σ_{k=1}^{n} P[A | Bk] P[Bk]

Example 20

P[E] = P[E | E1] P[E1] + P[E | E2] P[E2]
     + P[E | E3] P[E3] + P[E | E4] P[E4]
     = (0.50)(0.3) + (0.30)(0.25) + (0.10)(0.25) + (0.02)(0.20)
     = 0.254
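The total-probability sum in Example 20 can be written as a one-line weighted sum; a minimal sketch:

```python
# Law of total probability for Example 20: P[E] = Σ P[E|Ek] P[Ek].
cond = [0.50, 0.30, 0.10, 0.02]    # P[E | Ek]
prior = [0.30, 0.25, 0.25, 0.20]   # P[Ek], a partition of S

p_E = sum(c * p for c, p in zip(cond, prior))
print(round(p_E, 3))  # 0.254
```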
Bayes Rule

Bayes Rule - 1

Bayes Rule:
If the events B1, B2, ..., Bn constitute a partition of the sample
space S such that P[Bk] ≠ 0, k = 1, 2, ..., n, then for any event
A in S such that P[A] ≠ 0,
P[Bk | A] = P[A ∩ Bk] / P[A]
          = P[A | Bk] P[Bk] / Σ_{j=1}^{n} P[A | Bj] P[Bj]

Bayes Rule - 2

Proof:
By definition of conditional probability,
P[Bk | A] = P[A ∩ Bk] / P[A]
and then using the partition law (law of total probability) for
the denominator, we obtain
P[Bk | A] = P[A ∩ Bk] / Σ_{j=1}^{n} P[A ∩ Bj]
Markov Chains

"Our brains are just not wired to do probability problems very
well."
- Persi Diaconis

A Markov chain is described by its one-step transition
probabilities
Pij = P[X_{n+1} = j | X_n = i],  i, j = 0, 1, 2, ...
collected in the transition matrix

P = | P00  P01 ... P0M |
    | P10  P11 ... P1M |
    |  :    :        :  |
    | PM0  PM1 ... PMM |

with Σ_j Pij = 1 for each i = 0, 1, 2, ...
The Markov property states that
P[X_n = j_n | X_{n−1} = j_{n−1}, ..., X1 = j1, X0 = j0]
  = P[X_n = j_n | X_{n−1} = j_{n−1}]
Example 21

Solution
The sample space of this experiment consists of sequences of 0s
and 1s. Each possible sequence corresponds to a path through the
"trellis" diagram shown. The nodes in the diagram denote the box
used in the nth subexperiment, and the labels on the branches
denote the outcome of a subexperiment. Thus the path 0011
corresponds to the sequence: the coin toss was heads, so the
first draw was from box 0; the outcome of the first draw was 0,
so the second draw was from box 0; the outcome of the second draw
was 1, so the third draw was from box 1; and the outcome of the
third draw was 1, so the fourth draw was from box 1.

Example 21 (continued)

Find P[0011].
P[0011] = P[1 | 1] P[1 | 0] P[0 | 0] P[0]
        = (5/6)(1/6)(2/3)(1/2)

Counting Techniques

"But to us, probability is the very guide of life."
- Bishop J. Butler
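A path probability like P[0011] is just the product of the branch probabilities along the trellis path; a minimal sketch that multiplies the four factors quoted above (exact arithmetic via fractions):

```python
from fractions import Fraction as F

# P[0011] = P[0] * P[0|0] * P[1|0] * P[1|1], using the quoted factors.
factors = [F(1, 2), F(2, 3), F(1, 6), F(5, 6)]

prob = F(1)
for f in factors:
    prob *= f

print(prob)  # 5/108
```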
Counting Techniques - 1

Counting Techniques - 2

If a combined experiment consists of k subexperiments, where the
first has n1 possible outcomes a11, a12, ..., a1n1, and the k-th
has nk possible outcomes ak1, ak2, ..., aknk, then the total
number of outcomes of the combined experiment is
(total elements in the 1st experiment) × ... ×
(total elements in the last experiment) = n1 n2 ··· nk

Counting Techniques - 7

Permutations:
P(n, n) = n(n − 1)(n − 2)···(n − n + 1) = n!
P(n, k) = n(n − 1)(n − 2)···(n − k + 1)
        = [n(n − 1)(n − 2)···(n − k + 1)](n − k)! / (n − k)!
        = n! / (n − k)!

Counting Techniques - 9

Stirling's approximation:
n! ~ √(2πn) (n/e)^n,  i.e.,  lim_{n→∞} n! / [√(2πn) n^n e^{−n}] = 1
with the convention 0! = 1.

Example:
P(25, 2) = 25!/(25 − 2)! = 25 · 24 = 600
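The formula P(n, k) = n!/(n − k)! can be checked against the worked value P(25, 2) = 600; a minimal sketch:

```python
from math import factorial, perm

# P(n, k) = n!/(n - k)!, checked two ways.
n, k = 25, 2
via_factorials = factorial(n) // factorial(n - k)
print(via_factorials, perm(n, k))  # 600 600
```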
Counting Techniques - 11

Counting Techniques - 12

Multinomial Coefficient
Suppose n distinct elements are divided into k different groups
(k ≥ 2), where for j = 1, ..., k the j-th group contains exactly
nj elements, with n1 + n2 + ... + nk = n. We want to determine the
number of ways in which the n elements can be divided into k such
groups, i.e., how many ways n distinguishable balls can be
distributed into k different boxes so that there are nj balls in
box j.

Choosing the groups one at a time,
(n choose n1)(n − n1 choose n2)(n − n1 − n2 choose n3) ···
  (n_{k−1} + n_k choose n_k)
Hence
(n choose n1, n2, ..., nk) = n! / (n1! n2! ··· nk!)

Counting Techniques - 13

Multinomial theorem: for any numbers x1, x2, ..., xm and any
positive integer n,
(x1 + x2 + ... + xm)^n
  = Σ [n!/(k1! k2! ··· km!)] x1^{k1} x2^{k2} ··· xm^{km}
where the sum is over all non-negative integers k1, ..., km with
k1 + ... + km = n.

Counting Techniques - 14

The product of binomial coefficients telescopes:
(n choose n1)(n − n1 choose n2) ··· (nk choose nk)
  = n!/(n1! n2! ··· nk!)
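The telescoping identity above is easy to verify for a concrete split, say n = 10 into groups of sizes (3, 3, 4); a minimal sketch:

```python
from math import factorial, comb

# Multinomial coefficient n!/(n1!...nk!) versus the telescoping
# product of binomial coefficients, for n = 10 split as (3, 3, 4).
n, groups = 10, (3, 3, 4)

denom = 1
for g in groups:
    denom *= factorial(g)
multinomial = factorial(n) // denom

prod, rem = 1, n
for g in groups:
    prod *= comb(rem, g)   # choose the next group from what remains
    rem -= g

print(multinomial, prod)  # 4200 4200
```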
Counting Techniques - 15

Combinations: the number of ways of choosing k elements out of n
without regard to order is
C(n, k) = (n choose k) = P(n, k)/k! = n! / (k!(n − k)!)

Counting Techniques - 16

Sampling with replacement and without ordering can be counted
with the "stars and bars" method: distributing indistinguishable
items (stars) among urns separated by bars amounts to counting
the number of ways to put the stars and bars in order.
Random Variable

"The degree of understanding of a phenomenon is inversely
proportional to the number of variables used for its description."
- Unknown Physicist

(Figure: a random variable X(·; s) maps each outcome s in the
sample space S = {s1, s2, ..., sk, ...} to a point on the real
line; a set A ⊆ S maps to an interval I ⊆ R, and
P[X ∈ I] = P[A]. For example, S = {HH, HT, TH, TT} is mapped into
the real line by the number of heads.)

Definition:
Suppose that (S, F, P) is a probability space in which S is not
necessarily countable. A Random Variable, X, defined on this
space is a function from S into the real line such that the set
{ω | X(ω) ≤ x} ∈ F for every real x.
A Random Variable, X, defined on the probability space is a
function that assigns a real number X(ω) to every random outcome
ω ∈ S.
Translated, a Random Variable is a real-valued function that
associates a real number with each element in the sample space.

Example 25

Note:
The function that assigns a value to each outcome is fixed and
deterministic, e.g., the number of heads in three tosses of a
coin. However, the outcome of the experiment is not known. No
matter how carefully a process is run, an experiment is
performed, or a measurement is taken, there will be variability
when the action is repeated.
If the outcome ω is already a numerical value, then we can make
the assignment X(ω) = ω.
Example 26 (21)

Example 27 (22)

Cumulative Distribution Function (CDF):
FX(x) = P[X ≤ x],  continuous case
FX(x) = Σ_k P[X = xk] u(x − xk),  discrete case
where u(·) is the unit step function.

(Figure: a continuous CDF rises smoothly from 0 to 1, while a
discrete CDF, e.g., for SX = {0, 1, 2}, is a staircase.)

Properties of CDF - 1
1) 0 ≤ FX(x) ≤ 1, −∞ < x < ∞ (from Axiom I and Corollary 2)
2) FX(∞) = 1
3) FX(−∞) = 0
4) FX(x) is non-decreasing: FX(x1) ≤ FX(x2) for x1 ≤ x2
5) FX(x) is continuous from the right, i.e., for any b and h > 0,
   FX(b) = lim_{h→0} FX(b + h)
6) P[X = b] = FX(b) − FX(b−)
7) P[X = b] = 0 if FX(x) is continuous at b
8) P[X > x] = 1 − FX(x)

Properties of CDF - 2

Example 28 (23)
Given that
FX(x) = 1 − e^{−2x} for x ≥ 0, and 0 for x < 0,
the properties can be verified directly; for instance
FX(1/2) = 1 − e^{−1}, and FX(x1) ≤ FX(x2) whenever x1 ≤ x2.

Example 29 (24)
Toss a fair coin twice and let X = number of heads. Then
FX(x) = 0   for x < 0       (no heads possible below 0)
      = 1/4 for 0 ≤ x < 1   (# of heads = 0)
      = 3/4 for 1 ≤ x < 2   (# of heads ≤ 1)
      = 1   for x ≥ 2       (# of heads ≤ 2)

Example 30 (25)
Given the piecewise CDF
FX(x) = 0 for x < 0;  x²/16 for 0 ≤ x < 4;  1 for x ≥ 4,
one can evaluate, e.g.,
FX(3/2) = P[X ≤ 3/2] = (3/2)²/16 = 9/64
P[1/2 < X ≤ 3/2] = FX(3/2) − FX(1/2) = 9/64 − 1/64 = 1/8
Example 31 (26)

Interval probabilities from the CDF:
1) P[a < X ≤ b] = FX(b) − FX(a)
2) P[a ≤ X ≤ b] = FX(b) − FX(a) + P[X = a]
3) P[a ≤ X < b] = FX(b) − FX(a) + P[X = a] − P[X = b]
4) P[a < X < b] = FX(b) − FX(a) − P[X = b]
5) P[a < X] = 1 − FX(a)
6) P[X > a] = 1 − P[X ≤ a] = 1 − FX(a)

For the exponential CDF
FX(x) = 1 − e^{−x}, 0 ≤ x;  0 else,
P[X ≤ 1/4] = FX(1/4) = 1 − e^{−1/4} ≈ 0.2212
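The interval rules above reduce every probability query to CDF evaluations; a minimal sketch for the exponential CDF of Example 31:

```python
from math import exp

# Exponential CDF F(x) = 1 - e^{-x} for x >= 0, as in Example 31,
# and the interval rule P[a < X <= b] = F(b) - F(a).
F = lambda x: 1 - exp(-x) if x >= 0 else 0.0

print(round(F(0.25), 4))       # 0.2212
print(round(F(2) - F(1), 4))   # P[1 < X <= 2]
```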
Probability Density Function (PDF):
fX(x) = dFX(x)/dx,  continuous case
fX(x) = d/dx Σ_k P[X = xk] u(x − xk) = Σ_k p(xk) δ(x − xk),
  discrete case

Properties of PDF
Continuous:
1) fX(x) ≥ 0
2) ∫_{−∞}^{∞} fX(x) dx = FX(∞) = 1
3) FX(x) = ∫_{−∞}^{x} fX(t) dt
4) P[a < X ≤ b] = ∫_{a}^{b} fX(x) dx
Discrete (PMF):
1) 0 ≤ P[X = xk] ≤ 1
2) Σ_k P[X = xk] = 1
3) FX(x) = Σ_{xk ≤ x} P[X = xk]
4) P[a < X ≤ b] = Σ_{a < xk ≤ b} P[X = xk]

Interval probabilities as integrals:
P[a < X ≤ b] = ∫_{a}^{b} f(x) dx
with the same end-point corrections as for the CDF when P[X = a]
or P[X = b] is nonzero.

Example 32 (27)
f(x) = |x| for −1 ≤ x ≤ 1, and 0 else, is a valid PDF:
∫_{−1}^{1} |x| dx = ∫_{−1}^{0} (−x) dx + ∫_{0}^{1} x dx
                  = 1/2 + 1/2 = 1

Note: for any real number a, a− < a < a+, with a−, a+ arbitrarily
close to a; then P[X = a] = F(a+) − F(a−).
Conditional CDF

Conditional Distribution
From the definition of conditional probability
P[A | B] = P[A ∩ B]/P[B], with A the event {X ≤ x}, we obtain the
conditional CDF
FX(x | B) = P[X ≤ x | B] = P[{X ≤ x} ∩ B] / P[B]

Properties:
1) 0 ≤ F(x | B) ≤ 1
2) F(∞ | B) = 1
3) F(−∞ | B) = 0
4) F(x | B) is non-decreasing: F(a | B) ≤ F(b | B) if a ≤ b
5) F(x | B) is continuous from the right:
   F(x+ | B) = F(x | B)
6) P[x1 < X ≤ x2 | B] = F(x2 | B) − F(x1 | B), if x1 ≤ x2

Example 33 (28)

Solution
First find c for the PDF f(x) = c e^{−|x|}:
∫_{−∞}^{∞} c e^{−|x|} dx = 2 ∫_{0}^{∞} c e^{−x} dx = 2c = 1,
so c = 1/2.
Then
P[|X| ≤ v] = 2 ∫_{0}^{v} (1/2) e^{−x} dx = 1 − e^{−v}
Conditional PDF

Conditional Density
From the definition of conditional probability, we obtain the
conditional PDF
fX(x | B) = dFX(x | B)/dx

Properties:
1) fX(x | B) ≥ 0
2) ∫_{−∞}^{∞} f(x | B) dx = FX(∞ | B) = 1
3) F(x | B) = ∫_{−∞}^{x} f(y | B) dy
4) P[x1 < X ≤ x2 | B] = ∫_{x1}^{x2} f(y | B) dy

Important discrete random variables:
Bernoulli RV
Binomial RV
Negative Binomial RV
Poisson RV
Hypergeometric RV
Zeta RV
Repeated Trials - 1

For independent trials,
P(A1 ∩ A2 ∩ ... ∩ An) = P1(A1) P2(A2) ··· Pn(An)

Bernoulli Trial - 2

In n independent Bernoulli trials with success probability p
(q = 1 − p),
P[X = k] = (n choose k) p^k q^{n−k}
and the probabilities sum to one:
Σ_{k=0}^{n} (n choose k) p^k q^{n−k} = (p + q)^n = 1
Bernoulli Trial - 3

Suppose for a given n and p we want to find the most likely value
of k. From the figure (Pn(k) for n = 12, p = 1/2), the most
probable value of k is the number which maximizes Pn(k).
Forming the ratio of successive terms,
Pn(k)/Pn(k−1) = [(n − k + 1)/k] (p/q)
Thus Pn(k) ≥ Pn(k−1) if
k(1 − p) ≤ (n − k + 1) p,  i.e.,  k ≤ (n + 1) p
So kmax = (n + 1)p if it is an integer, or else the largest
integer less than (n + 1)p. This represents the most likely
number of successes (or heads) in n trials.

Bernoullis Theorem - 1

The mean number of successes is
Σ_{k=0}^{n} k Pn(k)
  = Σ_{k=1}^{n} k [n!/((n − k)! k!)] p^k q^{n−k}
  = np Σ_{i=0}^{n−1} [(n − 1)!/((n − 1 − i)! i!)] p^i q^{n−1−i}
  = np (p + q)^{n−1} = np
Bernoullis Theorem - 2

Similarly, the second moment is
Σ_{k=0}^{n} k² Pn(k)
  = Σ_{k=1}^{n} k [n!/((n − k)! (k − 1)!)] p^k q^{n−k}
  = n²p² + npq

Bernoullis Theorem - 3

Hence
Σ_{k=0}^{n} (k − np)² Pn(k)
  = Σ_{k=0}^{n} k² Pn(k) − 2np Σ_{k=0}^{n} k Pn(k) + n²p²
  = (n²p² + npq) − 2np(np) + n²p² = npq

Bernoullis Theorem - 4

For any ε > 0, the condition |k − np| ≥ εn is equivalent to
(k − np)² ≥ ε²n², so
Σ_{|k−np| ≥ εn} (k − np)² Pn(k) ≥ ε²n² Σ_{|k−np| ≥ εn} Pn(k)
Combining with Σ (k − np)² Pn(k) = npq,
P[|k − np| ≥ εn] = Σ_{|k−np| ≥ εn} Pn(k) ≤ npq/(ε²n²) = pq/(ε²n)
This is Bernoulli's theorem: the relative frequency k/n
concentrates around p as n grows.
Pascals Triangle

Binomial coefficient identities:
Symmetry:
(n choose k) = (n choose n − k)
Factorial:
(n choose k) = (n/k)(n − 1 choose k − 1)
Addition (Pascal's rule):
(n choose r) = (n − 1 choose r) + (n − 1 choose r − 1), 1 ≤ r ≤ n
Boundary values:
(n choose 0) = (n choose n) = 1
Product (binomial theorem term):
(n choose k) x^k y^{n−k}

Rows of Pascal's triangle list (n choose 0), (n choose 1), ...,
(n choose n) for n = 0, 1, 2, 3, ...
Each row begins and ends with a 1, and each interior entry is the
sum of the two entries above it.
Indicator function:
I_X(ω) = 1, if ω ∈ A;  0, otherwise

Bernoulli RV, X ~ B(1, p):
S_X = {0, 1};  P[X = 1] = p,  P[X = 0] = 1 − p

Binomial RV, X ~ B(n, p):
p_X(k) = P[X = k] = (n choose k) p^k (1 − p)^{n−k},
  k = 0, 1, ..., n

Geometric RV, X ~ G(1, p):
p_X(k) = P[X = k] = p (1 − p)^{k−1},  k = 1, 2, ...
(Trellis interpretation: each trial fails with probability 1 − p
until the first success, with probability p.)

Negative Binomial RV:
p_X(k) = P[X = k] = (k − 1 choose r − 1) p^r (1 − p)^{k−r},
  k = r, r + 1, ...;  0, k = 0, 1, ..., r − 1
Poisson RV:
For n large and p small, with λ = np,
P[X = k] = (n choose k) p^k (1 − p)^{n−k} ≈ (λ^k / k!) e^{−λ},
  k = 0, 1, 2, ...

Proof:
P[X = k] = [n!/((n − k)! k!)] (λ/n)^k (1 − λ/n)^{n−k}
         = [n(n − 1)···(n − k + 1)/n^k] (λ^k/k!)
           (1 − λ/n)^n (1 − λ/n)^{−k}
Since (1 − λ/n)^n → e^{−λ}, (1 − λ/n)^{−k} → 1 and
n(n − 1)···(n − k + 1)/n^k → 1 as n → ∞,
P[X = k] → (λ^k/k!) e^{−λ}

It is assumed that λ = average number of occurrences per unit of
time. The Poisson PDF and CDF are
f_X(x) = Σ_{k=0}^{∞} e^{−λ} (λ^k/k!) δ(x − k)
F_X(x) = Σ_{k=0}^{∞} e^{−λ} (λ^k/k!) u(x − k)
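The quality of the Poisson approximation for n large and p small can be seen numerically; a minimal sketch with assumed values n = 1000, p = 0.003:

```python
from math import comb, exp, factorial

# Poisson approximation to the binomial for n large, p small, lam = n*p.
n, p = 1000, 0.003
lam = n * p

for k in range(5):
    binom = comb(n, k) * p**k * (1 - p)**(n - k)
    poisson = exp(-lam) * lam**k / factorial(k)
    print(k, round(binom, 5), round(poisson, 5))
```

The two columns agree to about three decimal places, as the limit argument above predicts.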
Zeta (Zipf) RV:
p_X(k) = P[X = k] = C / k^{α+1},  k = 1, 2, ...
where C = [Σ_{k=1}^{∞} (1/k)^{α+1}]^{−1}

Hypergeometric RV:
p_X(x) = P[X = x] = (a choose x)(b choose n − x) / (a + b choose n),
  x = 0, 1, ..., a

Summary of discrete random variables:
1. Bernoulli: P(X = 0) = q, P(X = 1) = p
2. Binomial, X ~ B(n, p):
   P(X = k) = (n choose k) p^k q^{n−k},  k = 0, 1, ..., n
3. Poisson, X ~ P(λ):
   P(X = k) = e^{−λ} λ^k / k!,  k = 0, 1, 2, ...
4. Hypergeometric:
   P(X = k) = (m choose k)(N − m choose n − k) / (N choose n),
   max(0, m + n − N) ≤ k ≤ min(m, n)
5. Geometric, X ~ g(p):
   P(X = k) = p q^k,  k = 0, 1, 2, ...,  q = 1 − p
6. Negative Binomial:
   P(X = k) = (k − 1 choose r − 1) p^r q^{k−r},  k = r, r + 1, ...
7. Discrete Uniform:
   P(X = k) = 1/N,  k = 1, 2, ..., N
Continuous Random Variables:
Uniform RV
Gaussian (Normal) RV
Exponential RV
Rayleigh RV
Nakagami RV
Cauchy RV
Gamma RV
Erlang RV
Laplacian RV
Rician RV
Weibull RV
Beta RV
Chi-squared RV
Pareto RV
Log-normal RV

Uniform RV:
A uniform RV is given by
f_X(x) = 1/(b − a),  a ≤ x ≤ b;  0, otherwise
F_X(x) = 0, x < a;  (x − a)/(b − a), a ≤ x ≤ b;  1, b < x

Exponential RV:
f_X(x) = λ e^{−λ(x − a)},  x ≥ a;  0, x < a
F_X(x) = 1 − e^{−λ(x − a)},  x ≥ a;  0, x < a
Memoryless property of the exponential RV (with a = 0):
P[X > s + t | X > t] = P[X > s]

Proof:
From the conditional probability definition, one obtains
P[X > s + t | X > t] = P[X > s + t, X > t] / P[X > t]
                     = P[X > s + t] / P[X > t]
                     = e^{−λ(s+t)} / e^{−λt}
                     = e^{−λs} = P[X > s]

Rayleigh RV:
f_X(x) = ((x − a)/σ²) e^{−(x − a)²/(2σ²)},  x ≥ a;  0, x < a
F_X(x) = 1 − e^{−(x − a)²/(2σ²)},  x ≥ a;  0, x < a
Often a = 0.
The Rayleigh RV with parameter σ = 1 corresponds to the chi
distribution with 2 degrees of freedom.
The square of a Rayleigh RV with parameter σ corresponds to an
exponential RV with parameter 1/(2σ²).
The Rayleigh PDF and CDF are commonly used in communications.
Rician RV:
f_X(x) = (x/σ²) exp[−(x² + a²)/(2σ²)] I₀(ax/σ²),  x ≥ 0;
  0, x < 0
F_X(x) = 1 − Q(a/σ, x/σ)
where Q(α, β) is the Marcum Q-function
Q(α, β) = ∫_{β}^{∞} x exp[−(x² + α²)/2] I₀(αx) dx

Nakagami RV:
f_X(x) = (2/Γ(m)) (m/Ω)^m x^{2m−1} exp(−m x²/Ω),  x ≥ 0;
  0, x < 0
m = 1 gives the Rayleigh PDF; m = 0.5 gives the one-sided
Gaussian. Also known as the m-distribution.

Cauchy RV:
f_X(x) = (α/π) / (x² + α²),  −∞ < x < ∞
F_X(x) = P[X ≤ x] = 1/2 + (1/π) tan^{−1}(x/α)
var(X) = ∞
Gamma RV:
f_X(x) = (λ/Γ(α)) (λx)^{α−1} e^{−λx},  x > 0, α > 0, λ > 0;
  0, x ≤ 0
where
Γ(α) = ∫_{0}^{∞} x^{α−1} e^{−x} dx

Erlang RV (Gamma with α = n, a positive integer):
F_X(x) = 1 − e^{−λx} Σ_{k=0}^{n−1} (λx)^k / k!
In general, F_X(x) = G(α, λx), the incomplete gamma function.
Note that
Γ(1/2) = √π;  Γ(m + 1) = m!, m = 1, 2, ...;  Γ(m + 1) = m Γ(m)
Laplacian RV:
f_X(x) = (α/2) e^{−α|x − a|},  −∞ < x < ∞
F_X(x) = (1/2) e^{α(x − a)},  x < a;
         1 − (1/2) e^{−α(x − a)},  x ≥ a

Weibull RV:
f_X(x) = αβ (x − a)^{β−1} exp[−α(x − a)^β],
  x ≥ a, α > 0, β > 0;  0, x < a
F_X(x) = 1 − exp[−α(x − a)^β],  x ≥ a;  0, x < a
Beta RV:
f_X(x) = [1/B(α, β)] x^{α−1} (1 − x)^{β−1},  0 < x < 1;
  0, otherwise
where B(α, β) = Γ(α)Γ(β)/Γ(α + β) = ∫_0^1 x^{α−1}(1 − x)^{β−1} dx

Chi-squared RV (n degrees of freedom):
f_X(x) = x^{n/2 − 1} exp(−x/2) / (2^{n/2} Γ(n/2)),  x > 0;
  0, x ≤ 0
F_X(x) = G(n/2, x/2), the incomplete gamma function.
Pareto RV:
f_X(x) = α / x^{α+1},  x ≥ 1;  0, otherwise
F_X(x) = 1 − 1/x^α,  x ≥ 1

F-distribution (m and n degrees of freedom):
f_X(x) = [Γ((m + n)/2) / (Γ(m/2) Γ(n/2))] m^{m/2} n^{n/2}
         x^{m/2 − 1} (mx + n)^{−(m+n)/2},  x > 0

Student's t-distribution (n degrees of freedom):
f_T(t) = [Γ((n + 1)/2) / (√(nπ) Γ(n/2))] (1 + t²/n)^{−(n+1)/2}
Gaussian (Normal) RV - 1

(Figure: the empirical rule for the normal curve. About 68% of
the probability lies within 1 standard deviation of the mean (34%
on each side) and about 95% within 2 standard deviations, with
roughly 13.5% in each band between 1 and 2 standard deviations,
2.4% between 2 and 3, and 0.1% in each tail beyond 3.)

Gaussian (Normal) RV - 3

X ~ N(μ, σ²):
f_X(x) = (1/√(2πσ²)) exp[−(x − μ)²/(2σ²)]

Gaussian (Normal) RV - 4

(Figures: Gaussian curves with μ1 < μ2 and σ1 = σ2, and with
μ1 = μ2 and σ1 < σ2.)

It is the most important of all densities and models more
different random occurrences than any other PDF. It is the most
widely used model of noise in communication systems.

Gaussian (Normal) RV - 5

Gaussian (Normal) RV - 6

Gaussian (Normal) RV - 7
Gaussian (Normal) RV - 8

F_X(x) = P[X ≤ x]
       = ∫_{−∞}^{x} (1/√(2πσ²)) exp[−(t − μ)²/(2σ²)] dt

(Figure: the standard normal CDF, μ = 0, σ = 1.)

Gaussian (Normal) RV - 10

For X ~ N(μ, σ²), the standardized variable (X − μ)/σ ~ N(0, 1),
with standard normal CDF
Φ(x) = (1/√(2π)) ∫_{−∞}^{x} exp(−t²/2) dt

Gaussian (Normal) RV - 11

Hence
F_X(a) = P[X ≤ a]
       = ∫_{−∞}^{a} (1/√(2πσ²)) exp[−(x − μ)²/(2σ²)] dx
       = Φ((a − μ)/σ)
Gaussian (Normal) RV - 12

P[X > a] = ∫_{a}^{∞} (1/√(2πσ²)) exp[−(x − μ)²/(2σ²)] dx
Substituting y = (x − μ)/σ,
P[X > a] = (1/√(2π)) ∫_{(a−μ)/σ}^{∞} exp(−y²/2) dy
         = Q((a − μ)/σ)

Gaussian (Normal) RV - 13

The Q function:
Q(x) = (1/√(2π)) ∫_{x}^{∞} exp(−y²/2) dy = 1 − Φ(x) = 1 − F_X(x)
  (for the standard normal)
Properties:
Q(−x) = 1 − Q(x)
Q(0) = 1/2
Also Φ(−x) = 1 − Φ(x)
Gaussian (Normal) RV - 14

1) P[X ≤ a] = 1 − Q((a − μ)/σ)
2) P[X > a] = 1 − P[X ≤ a] = Q((a − μ)/σ)
3) P[a < X ≤ b] = F_X(b) − F_X(a)
               = Q((a − μ)/σ) − Q((b − μ)/σ)

Gaussian (Normal) RV - 15

The ratio of two independent unit Gaussian RVs, N(0,1), is the
standard Cauchy.
The sample mean of n independent and identically distributed RVs,
each with mean m and variance σ², tends to be Gaussian
distributed with mean m and variance σ²/n, as n → ∞.
Lognormal RV

f_X(x) = [1/(√(2π) σ (x − a))] exp{−[ln(x − a) − b]²/(2σ²)},
  x > a;  0, x ≤ a
F_X(x) = Φ([ln(x − a) − b]/σ),  x > a;  0, x ≤ a
Example 35 (29a)
Let X ~ N(3, 9), so μ = 3 and σ = 3.

a) P[2 < X ≤ 5] = P[(2 − 3)/3 < z ≤ (5 − 3)/3]
   = P[−1/3 < z ≤ 2/3]
   = Φ(2/3) − Φ(−1/3) = Φ(2/3) + Φ(1/3) − 1
   (using Φ(−x) = 1 − Φ(x))

b) P[X > 0] = P[z > (0 − 3)/3] = P[z > −1] = Φ(1) = 0.8413

c) P[|X − 3| > 6] = P[X − 3 > 6] + P[X − 3 < −6]
   = P[z > 2] + P[z < −2]
   = 2[1 − Φ(2)] = 2(1 − 0.9772) = 0.0456
Example 36 (29b)

Let X ~ N(6, 25), so μ = 6 and σ = 5. Then
P[1 < X ≤ 12] = P[(1 − 6)/5 < z ≤ (12 − 6)/5] = P[−1 < z ≤ 1.2]
             = 1 − Q(1) − Q(1.2)
             = 1 − 0.1587 − 0.1151 = 0.7262
Statistical Properties of Random Variables

"Rowe's Rule: the odds are six to five that the light at the end
of the tunnel is the headlight of an oncoming train."
- Paul Dickson

Expectation of a RV - 1

The expected value (mean) of a RV is
m_X = E[X] = ∫_{−∞}^{∞} x f_X(x) dx,  continuous
m_X = E[X] = Σ_{k=1}^{n} x_k p(x_k),  discrete

Expectation of a RV - 2

(Figure: the mean as the center of gravity of a PMF with masses
f(x1), ..., f(x5) at x1, ..., x5.)

Expectation of a RV - 3

E[X] exists only if
∫_{−∞}^{∞} |x| f_X(x) dx < ∞
i.e., only when the integral converges absolutely.
In general, if f_X(x) has one peak at X = x1 and is symmetric
about x1, then E[X] = x1; otherwise the mean value does not
necessarily lie at X = x1.
Note that the notation E[X] is not a function of X.
Expectation of a RV - 4

Properties:
1) E[c] = c, c is a constant
2) E[cX] = c E[X]
3) E[X + c] = E[X] + c
4) E[X + Y] = E[X] + E[Y]
5) E[X] ≤ E[Y], if P[X ≤ Y] = 1
6) |E[X]| ≤ E[|X|]
7) E[X1 + X2 + ... + XN] = E[N] E[X], for a random number N of
   i.i.d. terms

Expectation of a Function

To find E[Y] for Y = g(X), either first find the PDF of Y and
then use the definition to find E[Y], or calculate the
expectation directly using
E[g(X)] = ∫_{−∞}^{∞} g(x) f_X(x) dx,  continuous
E[g(X)] = Σ_k g(x_k) P[X = x_k],  discrete
Example 37 (30a)

For X uniform on [a, b], f_X(x) = 1/(b − a) for a ≤ x ≤ b and 0
else:
E[X] = ∫ x f_X(x) dx = (1/(b − a)) ∫_a^b x dx
     = (1/(b − a)) (b² − a²)/2 = (a + b)/2
Also
E[X²] = (1/(b − a)) ∫_a^b x² dx = (1/(b − a)) (b³ − a³)/3

Example 38 (30b)

For f_X(x) = 1/10, 0 ≤ x ≤ 10, and 0 else:
E[X] = ∫_0^{10} x (1/10) dx = x²/20 |_0^{10} = 5
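The uniform mean of Example 38 can be checked with a simple midpoint Riemann sum; a minimal sketch:

```python
# Riemann-sum check of E[X] = 5 for X uniform on [0, 10] (Example 38).
a, b, N = 0.0, 10.0, 100_000
dx = (b - a) / N

# Midpoint rule for the integral of x * f(x) with f(x) = 1/(b - a).
mean = sum((a + (i + 0.5) * dx) * (1 / (b - a)) * dx for i in range(N))
print(round(mean, 3))  # 5.0
```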
Example 39 (31)

Given f_X(x) = K e^{−bx} for x ≥ 0 and 0 for x < 0, find K and
E[X].
∫_{−∞}^{∞} f_X(x) dx = ∫_0^{∞} K e^{−bx} dx = K/b = 1, so K = b.
Then, integrating by parts,
E[X] = ∫_0^{∞} x b e^{−bx} dx
     = [−x e^{−bx}]_0^{∞} + ∫_0^{∞} e^{−bx} dx = 1/b

Example 40 (32)

Solution
E[X] = Σ_{k=1}^{5} x_k p(x_k) = 6.85
Moments of a RV:
The n-th moment is
E[X^n] = ∫_{−∞}^{∞} x^n f_X(x) dx,  continuous
E[X^n] = Σ_k x_k^n p(x_k),  discrete
The second moment E[X²] is the mean-square value, and
X_RMS = √(E[X²])
The n-th central moment is
E[(X − m_X)^n] = ∫_{−∞}^{∞} (x − m_X)^n f_X(x) dx
Example 41 (33)

Let f_X(x) = e^{-x}, x ≥ 0. Find E[X^2].

Solution: using the reduction formula

$$\int x^m e^{-ax}\,dx = -\frac{x^m}{a} e^{-ax} + \frac{m}{a}\int x^{m-1} e^{-ax}\,dx$$

we get

$$E[X^2] = \int_0^{\infty} x^2 e^{-x}\,dx = \left[-x^2 e^{-x}\right]_0^{\infty} + 2\int_0^{\infty} x e^{-x}\,dx = 2\left(\left[-x e^{-x}\right]_0^{\infty} + \int_0^{\infty} e^{-x}\,dx\right) = 2$$

(Compare the uniform case: E[X^2] = \frac{1}{b-a}\left[\frac{x^3}{3}\right]_a^b = \frac{b^3 - a^3}{3(b-a)}.)
Central Moments and Variance

$$E[(X - m_X)^n] = \int_{-\infty}^{\infty} (x - m_X)^n f_X(x)\,dx$$

Special cases:
- When n = 1, the first central moment is zero.
- When n = 2, the 2nd central moment is called the variance, i.e., the variance is the second central moment:

$$\operatorname{var}[X] = \sigma_X^2 = E[(X - m_X)^2] = E[X^2 - 2 m_X X + m_X^2] = E[X^2] - 2 m_X E[X] + m_X^2 = E[X^2] - m_X^2$$

$$\sigma_X^2 = \begin{cases} \int_{-\infty}^{\infty} (x - m_X)^2 f_X(x)\,dx, & \text{continuous} \\[4pt] \sum_k (x_k - m_X)^2\, P[X = x_k], & \text{discrete} \end{cases}$$

Standard deviation: \(\sigma_X = \sqrt{\operatorname{var}[X]}\).

For the standard Gaussian pdf \(f_X(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}\), this definition gives var[X] = 1.
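The shortcut var[X] = E[X^2] − m_X² is easy to exercise by Monte Carlo. A minimal sketch (my variable names), using a uniform RV whose variance has the closed form (b − a)²/12:

```python
# Monte Carlo check of var[X] = E[X^2] - m_X^2 for X uniform on (a, b).
import random

random.seed(1)
a, b = 2.0, 5.0
xs = [random.uniform(a, b) for _ in range(200000)]

m1 = sum(xs) / len(xs)                   # sample estimate of E[X]
m2 = sum(x * x for x in xs) / len(xs)    # sample estimate of E[X^2]
var_est = m2 - m1 * m1
var_true = (b - a) ** 2 / 12             # closed form for the uniform RV
```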
Properties of Variance - 1

1) If E[X] = μ, then E[aX + b] = aμ + b, and

$$\operatorname{Var}[aX + b] = E\big[(aX + b - (a\mu + b))^2\big] = E\big[a^2 (X - \mu)^2\big] = a^2 \operatorname{Var}[X]$$

Properties of Variance - 2

2) Variance of a sum. Suppose n = 2 with E[X1] = μ1, E[X2] = μ2, so E[X1 + X2] = μ1 + μ2. Then

$$\operatorname{Var}[X_1 + X_2] = E\big[(X_1 + X_2 - \mu_1 - \mu_2)^2\big] = E\big[((X_1 - \mu_1) + (X_2 - \mu_2))^2\big]$$
$$= E[(X_1 - \mu_1)^2] + E[(X_2 - \mu_2)^2] + 2E[(X_1 - \mu_1)(X_2 - \mu_2)]$$

If X1 and X2 are independent, E[(X1 − μ1)(X2 − μ2)] = E[X1 − μ1]·E[X2 − μ2] = 0. Hence

$$\operatorname{Var}[X_1 + X_2] = \operatorname{Var}[X_1] + \operatorname{Var}[X_2]$$

Example 42 (34)

For the uniform RV, E[X^2] = (b³ − a³)/(3(b − a)) and E[X] = (a + b)/2, so the variance is

$$\operatorname{Var}[X] = E[X^2] - (E[X])^2 = \frac{b^3 - a^3}{3(b-a)} - \left(\frac{a+b}{2}\right)^2 = \frac{(b-a)^2}{12}$$

You can also use the brute-force method:

$$\operatorname{Var}[X] = E[(X - m_X)^2] = \int_a^b \left(x - \frac{a+b}{2}\right)^2 \frac{1}{b-a}\,dx = \frac{(b-a)^2}{12}$$
Federal University of Technology, Minna
Example 43 (35)

$$f_X(x) = \begin{cases} b e^{-bx}, & x \ge 0 \\ 0, & x < 0 \end{cases} \qquad (K = b \text{ from Example 39}), \qquad m_X = \frac{1}{b}$$

$$\sigma_X^2 = E\left[\left(X - \tfrac{1}{b}\right)^2\right] = \int_0^{\infty} \left(x - \tfrac{1}{b}\right)^2 b e^{-bx}\,dx$$
$$= b\int_0^{\infty} x^2 e^{-bx}\,dx - 2\int_0^{\infty} x e^{-bx}\,dx + \frac{1}{b}\int_0^{\infty} e^{-bx}\,dx = \frac{2}{b^2} - \frac{2}{b^2} + \frac{1}{b^2} = \frac{1}{b^2}$$
Characteristic Function - 1

The characteristic function of X is

$$\Phi_X(\omega) = E\big[e^{j\omega X}\big] = \begin{cases} \int_{-\infty}^{\infty} e^{j\omega x} f_X(x)\,dx, & \text{continuous} \\[4pt] \sum_k e^{j\omega x_k}\, p_X(x_k), & \text{discrete} \end{cases}$$

with inverse

$$f_X(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty} \Phi_X(\omega)\, e^{-j\omega x}\,d\omega$$

Characteristic Function - 2

This pair is analogous to the Fourier transform pair

$$X(f) = \int x(t)\,e^{-j2\pi ft}\,dt = F[x(t)], \qquad x(t) = \int X(f)\,e^{j2\pi ft}\,df = F^{-1}[X(f)]$$

so f_X(x) ↔ Φ_X(ω) just as x(t) ↔ X(f). The moments are E[X] = ∫ x f_X(x) dx, E[X^2] = ∫ x² f_X(x) dx, E[X^3] = ∫ x³ f_X(x) dx, and in general E[X^n] = ∫ xⁿ f_X(x) dx.

Characteristic Function - 3

Differentiating under the integral,

$$\left.\frac{d\Phi_X(\omega)}{d\omega}\right|_{\omega=0} = \int jx\, f_X(x)\,e^{j\omega x}\,dx\bigg|_{\omega=0} = \int jx\, f_X(x)\,dx = jE[X]$$

$$\left.\frac{d^2\Phi_X(\omega)}{d\omega^2}\right|_{\omega=0} = \int (jx)^2 f_X(x)\,dx = j^2 E[X^2]$$

Characteristic Function - 4

Hence

$$E[X] = \frac{1}{j}\left[\frac{d\Phi_X(\omega)}{d\omega}\right]_{\omega=0}, \qquad E[X^2] = \frac{1}{j^2}\left[\frac{d^2\Phi_X(\omega)}{d\omega^2}\right]_{\omega=0}$$

and in general, since \(\frac{d^n\Phi_X}{d\omega^n}\big|_0 = j^n\int x^n f_X(x)\,dx = j^n E[X^n]\),

$$E[X^n] = \frac{1}{j^n}\left[\frac{d^n\Phi_X(\omega)}{d\omega^n}\right]_{\omega=0}$$
Example 37

$$f_X(x) = \begin{cases} e^{-x}, & x \ge 0 \\ 0, & x < 0 \end{cases} \quad\Rightarrow\quad \Phi_X(\omega) = \int_0^{\infty} e^{-x} e^{j\omega x}\,dx = \frac{1}{1 - j\omega}$$

Moment Generating Function

$$M_X(t) = E\big[e^{tX}\big] = \begin{cases} \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx, & \text{continuous} \\[4pt] \sum_{k=1}^{n} e^{t x_k}\, P[X = x_k], & \text{discrete} \end{cases}$$

Expanding e^{tX} in a power series,

$$M_X(t) = 1 + tE[X] + \frac{t^2 E[X^2]}{2!} + \frac{t^3 E[X^3]}{3!} + \cdots$$

Hence M_X(0) = 1, M_X^{(n)}(0) = E[X^n], and

$$E[X^n] = \left[\frac{d^n}{dt^n} M_X(t)\right]_{t=0}, \qquad \sigma_X^2 = M_X''(0) - \big[M_X'(0)\big]^2$$

Example 38

Property: if Y = aX + b, then M_Y(t) = e^{bt} M_X(at):

$$M_Y(t) = E\big[e^{Yt}\big] = E\big[e^{t(aX+b)}\big] = E\big[e^{bt} e^{atX}\big] = e^{bt} E\big[e^{(at)X}\big] = e^{bt} M_X(at)$$
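The moment-from-MGF relations can be checked with finite differences. This sketch assumes the exponential RV with b = 2, whose closed-form MGF M_X(t) = b/(b − t), t < b, is a standard result (not derived on these slides):

```python
# Moments by central finite differences on the MGF: M'(0) = E[X] and
# M''(0) = E[X^2].  For f_X(x) = b e^{-bx}, M_X(t) = b/(b - t).
b = 2.0
M = lambda t: b / (b - t)

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)               # ~ E[X]   = 1/b   = 0.5
m2 = (M(h) - 2 * M(0.0) + M(-h)) / h ** 2   # ~ E[X^2] = 2/b^2 = 0.5
var = m2 - m1 ** 2                          # ~ sigma^2 = 1/b^2 = 0.25
```

This also matches Example 43: the exponential variance is 1/b².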
Probability Generating Function

For a discrete RV taking nonnegative integer values,

$$G_X(z) = E\big[z^X\big] = \sum_{x=0}^{\infty} P[X = x]\, z^x, \qquad G_X(1) = \sum_{x} P[X = x] = 1$$

First derivative at z = 1:

$$\left.\frac{d}{dz} G_X(z)\right|_{z=1} = \sum_{x \ge 0} x\, P[X = x] = E[X]$$

Second derivative at z = 1:

$$\left.\frac{d^2}{dz^2} G_X(z)\right|_{z=1} = \sum_{x \ge 0} x(x-1)\, P[X = x]\, z^{x-2}\bigg|_{z=1} = E[X(X-1)] = E[X^2] - E[X]$$

so

$$E[X^2] = \left[\frac{d^2}{dz^2} G_X(z)\right]_{z=1} + \left[\frac{d}{dz} G_X(z)\right]_{z=1}$$

In general the nth derivative gives the nth factorial moment:

$$\left.\frac{d^n}{dz^n} G_X(z)\right|_{z=1} = E[X(X-1)\cdots(X-n+1)]$$
Laplace Transform

For a nonnegative RV,

$$L_X(s) = E\big[e^{-sX}\big] = \int_0^{\infty} e^{-sx} f_X(x)\,dx$$

$$E[X^n] = (-1)^n \left[\frac{d^n}{ds^n} L_X(s)\right]_{s=0}$$

Example 46

Example 47
Tail Inequalities

"It is always better to be approximately right than precisely wrong." — Unknown Engineer

1. Markov Inequality:
If X is a RV that takes nonnegative values, then for any value k > 0,

$$P[X \ge k] \le \frac{E[X]}{k} \qquad \text{(a first-order bound)}$$

Proof:

$$E[X] = \int_0^{\infty} x f_X(x)\,dx \ge \int_k^{\infty} x f_X(x)\,dx \ge \int_k^{\infty} k f_X(x)\,dx = k\,P[X \ge k]$$

Hence P[X ≥ k] ≤ E[X]/k.

2. Chebyshev's Inequality:
The Chebyshev Inequality (CI) gives a conservative estimate of the probability that a random variable X assumes a value within k standard deviations of its mean. Let X be a RV with mean μ and variance σ². Then for any value k > 0, at most 1/k² of the probability is distributed outside the interval (μ − kσ, μ + kσ). That is,

$$P\big[|X - \mu| \ge k\sigma\big] \le \frac{1}{k^2}, \qquad \text{equivalently} \qquad P\big[|X - \mu| \ge k\big] \le \frac{\sigma^2}{k^2}$$

Proof: Chebyshev's Inequality is a consequence of the Markov Inequality applied to the nonnegative RV (X − μ)²:

$$P\big[(X - \mu)^2 \ge k^2\big] \le \frac{E[(X - \mu)^2]}{k^2} = \frac{\sigma^2}{k^2}$$

Complement form:

$$P\big[|X - \mu| < k\sigma\big] \ge 1 - \frac{1}{k^2}$$

3. Chernoff Inequality:
If X is a RV, then since P[X ≥ k] = P[e^{tX} ≥ e^{tk}] ≤ e^{-kt} E[e^{tX}] (Markov),

$$\begin{cases} P[X \ge k] \le e^{-kt} M_X(t), & t > 0,\; k \ge E[X] \\ P[X \le k] \le e^{-kt} M_X(t), & t < 0,\; k \le E[X] \end{cases}$$
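Both bounds are easy to see at work empirically. A minimal sketch (my own setup, not from the slides), using an exponential RV with rate 1 so that mean = variance = 1:

```python
# Empirical check that the Markov and Chebyshev bounds are upper bounds,
# for X exponential with rate 1 (mean 1, variance 1) and threshold k = 3.
import random

random.seed(2)
xs = [random.expovariate(1.0) for _ in range(100000)]
mean, var = 1.0, 1.0

k = 3.0
p_markov = sum(x >= k for x in xs) / len(xs)           # true value e^{-3} ~ 0.050
p_cheby = sum(abs(x - mean) >= k for x in xs) / len(xs)

markov_bound = mean / k                                # 1/3
cheby_bound = var / k ** 2                             # 1/9
```

Note how loose both bounds are here — they hold for every distribution with the given mean and variance, which is exactly why they are conservative.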
Laws of Large Numbers and the Central Limit Theorem

Strong law of large numbers (almost-sure convergence of the sample mean \(\hat\mu_n\) to the true mean μ):

$$P\left[\lim_{n\to\infty} \hat\mu_n = \mu\right] = 1$$

Weak law of large numbers (convergence in probability):

$$\lim_{n\to\infty} P\big[|\hat\mu_n - \mu| < \varepsilon\big] = 1 \quad \text{for every } \varepsilon > 0$$

Central Limit Theorem: Let X1, X2, ..., XN be a sequence of iid RVs, each with finite mean μ and finite variance σ². Let S_n = X1 + X2 + ... + X_n, n ≥ 1, and let Z_n be a sequence of zero-mean, unit-variance RVs defined as

$$Z_n = \frac{S_n - n\mu}{\sigma\sqrt{n}}$$

Then

$$\lim_{n\to\infty} P[Z_n \le z] = \int_{-\infty}^{z} \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}\,dx, \qquad \text{i.e. } Z_n \to N(0,1)$$
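The CLT shows up even for modest n. A minimal sketch (assumptions: uniform(0,1) summands, n = 48, my own trial counts), comparing an empirical probability for Z_n against the standard normal CDF:

```python
# CLT demo: Z_n = (S_n - n*mu)/(sigma*sqrt(n)) for sums of uniform(0,1)
# RVs (mu = 1/2, sigma^2 = 1/12) is approximately N(0,1).
import math, random

random.seed(3)
n, trials = 48, 20000
mu, sigma = 0.5, math.sqrt(1 / 12)

zs = []
for _ in range(trials):
    s = sum(random.random() for _ in range(n))
    zs.append((s - n * mu) / (sigma * math.sqrt(n)))

# Compare P[Z_n <= 1] with Phi(1) ~ 0.8413.
p_emp = sum(z <= 1.0 for z in zs) / trials
phi1 = 0.5 * (1 + math.erf(1 / math.sqrt(2)))   # standard normal CDF at 1
```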
Transformation of a Random Variable

Let Y = g(X), where g: X → Y maps the sample space S_X of X (defined on (S, F, P)) into the sample space S_Y of Y. Given f_X(x), we want F_Y(y) and f_Y(y).

For a monotonically increasing g,

$$F_Y(y) = P[Y \le y] = P[g(X) \le y] = P\big[X \le g^{-1}(y)\big] = F_X\big(g^{-1}(y)\big)$$

Steps:
1) Solve for x in the given equation in terms of y: x = g^{-1}(y)
2) Substitute into the above equation: F_Y(y) = F_X(g^{-1}(y))
1. Linear Function: Y = aX + b

Case 1: a > 0. Then {Y ≤ y} = {X ≤ (y − b)/a}, so

$$F_Y(y) = P\left[X \le \frac{y-b}{a}\right] = F_X\!\left(\frac{y-b}{a}\right)$$

Case 2: a < 0. Then {Y ≤ y} = {X ≥ (y − b)/a}, so

$$F_Y(y) = P\left[X \ge \frac{y-b}{a}\right] = 1 - F_X\!\left(\frac{y-b}{a}\right)$$

Hence

$$F_Y(y) = \begin{cases} F_X\!\left(\dfrac{y-b}{a}\right), & a > 0 \\[6pt] 1 - F_X\!\left(\dfrac{y-b}{a}\right), & a < 0 \end{cases}$$

and, differentiating either case,

$$f_Y(y) = \frac{1}{|a|}\, f_X\!\left(\frac{y-b}{a}\right)$$
2. Square Function: Y = X²

Case 1: y ≥ 0.

$$F_Y(y) = P\big[-\sqrt{y} \le X \le \sqrt{y}\big] = F_X(\sqrt{y}) - F_X(-\sqrt{y})$$

Case 2: y < 0. There is no value of X for which X² ≤ y, hence F_Y(y) = P[∅] = 0.

$$F_Y(y) = \begin{cases} F_X(\sqrt{y}) - F_X(-\sqrt{y}), & y \ge 0 \\ 0, & y < 0 \end{cases}$$

Differentiating (for y > 0),

$$f_Y(y) = \frac{1}{2\sqrt{y}}\big[f_X(\sqrt{y}) + f_X(-\sqrt{y})\big]$$

General formula: if y = g(x) has n real roots x1, ..., xn, then

$$f_Y(y) = \sum_{k=1}^{n} \frac{f_X(x_k)}{\left|\dfrac{d}{dx} g(x)\right|_{x = x_k}}$$

Steps:
1) Given y = g(x), solve for the roots x_k in terms of y
2) Find |dg(x)/dx| at each root and apply the formula
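The root-sum formula can be validated numerically. A sketch under my own assumptions (X standard Gaussian, so Y = X² is the chi-square RV with one degree of freedom; midpoint rule, whose 1/√y endpoint limits accuracy to a few parts in a thousand):

```python
# Check f_Y(y) = [f_X(sqrt(y)) + f_X(-sqrt(y))]/(2 sqrt(y)) for Y = X^2,
# X ~ N(0,1), by computing P[Y <= 1] two ways.
import math, random

random.seed(4)
f_X = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
f_Y = lambda y: (f_X(math.sqrt(y)) + f_X(-math.sqrt(y))) / (2 * math.sqrt(y))

# Numeric integral of f_Y over (0, 1] (midpoint rule dodges the y = 0 singularity).
n = 20000
h = 1.0 / n
p_formula = sum(f_Y((i + 0.5) * h) for i in range(n)) * h

# Monte Carlo estimate of P[X^2 <= 1] = P[-1 <= X <= 1] ~ 0.6827.
p_mc = sum(random.gauss(0, 1) ** 2 <= 1.0 for _ in range(200000)) / 200000
```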
3. Y = aX², a > 0

Roots: x = ±√(y/a). With dy/dx = 2ax, the slope at each root is |g'(x)| = 2a√(y/a) = 2√(ay), so dx/dy = 1/(2√(ay)) and

$$f_Y(y) = \begin{cases} \dfrac{1}{2\sqrt{ay}}\left[f_X\!\left(\sqrt{\tfrac{y}{a}}\right) + f_X\!\left(-\sqrt{\tfrac{y}{a}}\right)\right], & y \ge 0 \\[6pt] 0, & y < 0 \end{cases}$$
4. Sinusoidal Function: Y = cos(X), X uniform on [0, 2π]

Here f_X(x) = 1/(2π) for 0 ≤ x ≤ 2π. For −1 < y < 1, y = cos x has two roots,

$$x_0 = \cos^{-1}(y) \in [0, \pi], \qquad x_1 = 2\pi - \cos^{-1}(y)$$

with \(\left|\dfrac{d \cos^{-1} y}{dy}\right| = \dfrac{1}{\sqrt{1 - y^2}}\). Therefore

$$f_Y(y) = f_X\big(\cos^{-1} y\big)\frac{1}{\sqrt{1-y^2}} + f_X\big(2\pi - \cos^{-1} y\big)\frac{1}{\sqrt{1-y^2}} = \left(\frac{1}{2\pi} + \frac{1}{2\pi}\right)\frac{1}{\sqrt{1-y^2}} = \frac{1}{\pi\sqrt{1-y^2}}, \quad -1 < y < 1$$

By integration,

$$F_Y(y) = \begin{cases} 0, & y < -1 \\[2pt] \dfrac{1}{2} + \dfrac{\sin^{-1}(y)}{\pi}, & -1 \le y \le 1 \\[6pt] 1, & y > 1 \end{cases}$$

(For Y = a sin(X) the same steps give f_Y(y) = 1/(π√(a² − y²)), |y| < a.)
5. Exponential Function: Y = e^{aX}

Solving ln y = ax gives x = (1/a) ln y, with dx/dy = 1/(ay), so for y > 0

$$f_Y(y) = \frac{1}{|a| y}\, f_X\!\left(\frac{1}{a}\ln y\right)$$

6. Reciprocal Function: Y = a/X

Here x = a/y and |dx/dy| = a/y², so

$$f_Y(y) = \frac{a}{y^2}\, f_X\!\left(\frac{a}{y}\right)$$

(For Y = aX², dy/dx = 2ax = 2√(ay) at the roots, hence |dx/dy| = 1/(2√(ay)) for y > 0 — consistent with the root-sum formula used above.)
[Figure: a level y cutting the curve y = g(x) at multiple roots x0, x1, x2, x3, x4.]

When y = g(x) has several roots x_n, sum over all of them:

$$f_Y(y) = \sum_n \frac{f_X(x_n)}{|g'(x_n)|}$$

For Y = a sin(X) this gives

$$f_Y(y) = \frac{1}{\sqrt{a^2 - y^2}} \sum_n f_X(x_n), \quad |y| < a \qquad \left(\text{for } a = 1:\; f_Y(y) = \frac{1}{\sqrt{1 - y^2}} \sum_n f_X(x_n)\right)$$
Multiple Random Variables

When there is more than one RV, we talk about joint events from the same sample space. Any ordered pair of numbers (x, y) can be considered as a point in the xy-plane.

[Figure: outcomes s1, s2 ∈ S mapped by X(·) and Y(·) into the joint sample space S_J.]

Let A = {X ≤ x} and B = {Y ≤ y}. Then

$$A \cap B = \{X \le x\} \cap \{Y \le y\} = \{X \le x,\, Y \le y\}$$

Events A and B refer to the sample space S, while the events {X ≤ x} and {Y ≤ y} refer to the joint sample space S_J.
Joint CDF

$$F_{XY}(x, y) = P[X \le x,\, Y \le y] \qquad \text{(same form for continuous and discrete RVs)}$$

F_XY(a, b) is the probability that X and Y lie in the semi-infinite region of the (x, y) plane below and to the left of the point (a, b).

Properties (the properties of the joint CDF are similar to those of the single-variable CDF):

1) 0 ≤ F_XY(x, y) ≤ 1
2) F_XY(x, y) is a nondecreasing function of both x and y
3) F_XY(−∞, y) = F_XY(x, −∞) = 0 and F_XY(∞, ∞) = 1
4) F_XY(x, ∞) = F_X(x) and F_XY(∞, y) = F_Y(y) (marginal CDFs)
5) P[X ≤ a, b1 < Y ≤ b2] = F_XY(a, b2) − F_XY(a, b1)
6) Rectangle probability, for a1 ≤ a2, b1 ≤ b2:

$$P[a_1 < X \le a_2,\, b_1 < Y \le b_2] = F_{XY}(a_2, b_2) - F_{XY}(a_1, b_2) - F_{XY}(a_2, b_1) + F_{XY}(a_1, b_1)$$

[Figure: rectangle with corners (a1, b1), (a2, b1), (a1, b2), (a2, b2).]

7) P[X > a, Y > b] = 1 − F_X(a) − F_Y(b) + F_XY(a, b)

When an endpoint is included (e.g. P[a1 ≤ X ≤ a2, b1 < Y ≤ b2]) add the corresponding boundary term, e.g. P[X = a1, b1 < Y ≤ b2], obtained from limits such as lim_{n→∞} F_XY(a − 1/n, b).

Note:
- These properties are the 2-dimensional extension of the properties of one random variable.
- Several of them may be used to test whether a given function is a valid joint CDF.
- As in the case of a single RV, the joint CDF can be used to compute probabilities of unions and intersections of semi-infinite rectangles.
Joint PDF

$$f_{XY}(x, y) = \frac{\partial^2 F_{XY}(x, y)}{\partial x\, \partial y}$$

It is assumed that X and Y are jointly continuous, else the derivative may not exist. It follows that

$$F_{XY}(x, y) = \int_{-\infty}^{x}\int_{-\infty}^{y} f_{XY}(\alpha, \beta)\,d\beta\,d\alpha$$

Properties:
1) f_XY(x, y) ≥ 0
2) \(\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{XY}(x, y)\,dx\,dy = F_{XY}(\infty, \infty) = 1\)
3) \(F_{XY}(x, y) = \int_{-\infty}^{x}\int_{-\infty}^{y} f_{XY}(\alpha, \beta)\,d\beta\,d\alpha\)
4) \(F_X(x) = \int_{-\infty}^{x}\int_{-\infty}^{\infty} f_{XY}(\alpha, \beta)\,d\beta\,d\alpha\)
5) \(F_Y(y) = \int_{-\infty}^{\infty}\int_{-\infty}^{y} f_{XY}(\alpha, \beta)\,d\beta\,d\alpha\)
6) \(P[a_1 < X \le a_2,\, b_1 < Y \le b_2] = \int_{b_1}^{b_2}\int_{a_1}^{a_2} f_{XY}(x, y)\,dx\,dy\)
7) \(f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dy\) (marginal pdf)
8) \(f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dx\)

Note: Properties 1 and 2 are sufficient to test the validity of a joint PDF.
Probabilities from the Joint PDF

1) \(P[a < X \le b,\, c < Y \le d] = \int_{c}^{d}\int_{a}^{b} f_{XY}(x, y)\,dx\,dy\)

The remaining cases follow by letting one or more limits go to ±∞, e.g.

2) \(P[a < X \le b,\, Y \le d] = \int_{-\infty}^{d}\int_{a}^{b} f_{XY}(x, y)\,dx\,dy\)
3) \(P[X \le b,\, c < Y \le d] = \int_{c}^{d}\int_{-\infty}^{b} f_{XY}(x, y)\,dx\,dy\)
4) \(P[a < X,\, c < Y \le d] = \int_{c}^{d}\int_{a}^{\infty} f_{XY}(x, y)\,dx\,dy\)
5) \(P[a < X \le b,\, c < Y] = \int_{c}^{\infty}\int_{a}^{b} f_{XY}(x, y)\,dx\,dy\)
6) \(P[X \le a,\, c < Y \le d] = \int_{c}^{d}\int_{-\infty}^{a} f_{XY}(x, y)\,dx\,dy\)
Joint PMF

$$p_{XY}(x, y) = P[X = x,\, Y = y]$$

Marginal distributions from the joint CDF/PDF:

$$F_X(x) = F_{XY}(x, \infty) = \int_{-\infty}^{x}\int_{-\infty}^{\infty} f_{XY}(\alpha, y)\,dy\,d\alpha$$

$$F_Y(y) = F_{XY}(\infty, y) = \int_{-\infty}^{y}\int_{-\infty}^{\infty} f_{XY}(x, \beta)\,dx\,d\beta$$

$$f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dx = \frac{d}{dy} F_{XY}(\infty, y)$$
1. Independence of X and Y

$$F_{XY}(x, y) = F_X(x)\,F_Y(y), \qquad f_{XY}(x, y) = f_X(x)\,f_Y(y)$$

This implies that if X and Y are independent, their joint pdf and joint cdf factor into the two marginal densities or distributions, respectively. Also

$$P[X \le x,\, Y \le y] = P[X \le x]\; P[Y \le y]$$
Expectation and Joint Moments

For Z = g(X, Y), E[Z] = ∫ z f_Z(z) dz, or directly

$$E[g(X, Y)] = \begin{cases} \int\!\!\int g(x, y)\, f_{XY}(x, y)\,dx\,dy, & \text{continuous} \\[4pt] \sum_n \sum_k g(x_n, y_k)\, p_{XY}(x_n, y_k), & \text{discrete} \end{cases}$$

The (i, j)th joint moment is

$$m_{ij} = E[X^i Y^j] = \int\!\!\int x^i y^j f_{XY}(x, y)\,dx\,dy$$

(with the analogous double sum in the discrete case). If X and Y are independent, m_ij = E[X^i] E[Y^j].

Correlation of X and Y

The correlation is the joint moment m11:

$$R_{XY} = E[XY] = \int\!\!\int x y\, f_{XY}(x, y)\,dx\,dy$$

The (i, j)th joint central moment is

$$\mu_{ij} = E\big[(X - m_X)^i (Y - m_Y)^j\big] = \int\!\!\int (x - m_X)^i (y - m_Y)^j f_{XY}(x, y)\,dx\,dy$$

Note:
- When R_XY = E[X]E[Y], X and Y are said to be uncorrelated.
- Independence ⇒ uncorrelatedness.
- Uncorrelatedness ⇏ (not always) independence.
Conditional PMF - 1 (Discrete)

Recall \(P[A|B] = \dfrac{P[A \cap B]}{P[B]}\). The conditional pmf of X given Y = y_j is

$$P[X = x_i \mid Y = y_j] = \frac{P[X = x_i,\, Y = y_j]}{P[Y = y_j]} = \frac{p_{XY}(x_i, y_j)}{p_Y(y_j)}$$

and similarly

$$P[Y = y_j \mid X = x_i] = \frac{P[X = x_i,\, Y = y_j]}{P[X = x_i]} = \frac{p_{XY}(x_i, y_j)}{p_X(x_i)}$$

Conditional PMF - 2

If X and Y are independent,

$$P[X = x_i \mid Y = y_j] = \frac{P[X = x_i]\, P[Y = y_j]}{P[Y = y_j]} = P[X = x_i] = p_X(x_i), \qquad P[Y = y_j \mid X = x_i] = p_Y(y_j) \tag{*, **}$$

The conditional cdf given a discrete conditioning event is

$$F_Y(y \mid x_i) = P[Y \le y \mid X = x_i] = \frac{P[Y \le y,\, X = x_i]}{P[X = x_i]}$$

Conditional Density - 1 (Continuous)

For continuous RVs, P[Y = y_j] = P[X = x_i] = 0, hence (*) and (**) are undefined for continuous RVs. Fortunately the numerators are also zero, so we treat (*) and (**) as limiting cases. For X and Y jointly continuous,

$$P[X \le x \mid y < Y \le y + \Delta y] = \frac{\int_{-\infty}^{x}\int_{y}^{y+\Delta y} f_{XY}(\alpha, \beta)\,d\beta\,d\alpha}{\int_{y}^{y+\Delta y} f_Y(\beta)\,d\beta}$$

By the mean value theorem, \(\int_a^b g(\xi)\,d\xi = (b - a)\,g(c)\) for some a ≤ c ≤ b, so both integrals over the thin strip shrink proportionally to Δy.
Conditional Density - 2

Letting Δy → 0,

$$F_{X|Y}(x \mid y) = \lim_{\Delta y \to 0} P[X \le x \mid y < Y \le y + \Delta y] = \frac{\int_{-\infty}^{x} f_{XY}(\alpha, y)\,d\alpha}{f_Y(y)}$$

$$f_{X|Y}(x \mid y) = \frac{d}{dx} F_{X|Y}(x \mid y) = \frac{f_{XY}(x, y)}{f_Y(y)}$$

Conditional Density - 3

Similarly,

$$f_{Y|X}(y \mid x) = \frac{f_{XY}(x, y)}{f_X(x)}$$

If X and Y are independent, f_{X|Y}(x|y) = f_X(x) and f_{Y|X}(y|x) = f_Y(y). Consequently (Bayes' rule for densities),

$$f_{Y|X}(y \mid x) = \frac{f_{X|Y}(x \mid y)\, f_Y(y)}{f_X(x)}$$
Conditional Expectation

$$E[Y \mid x] = \begin{cases} \int_{-\infty}^{\infty} y\, f_{Y|X}(y \mid x)\,dy, & \text{continuous} \\[4pt] \sum_j y_j\, p_Y(y_j \mid x), & \text{discrete} \end{cases}$$

Theorem 1 (total expectation): For any random variables X and Y, E[E[Y|X]] = E[Y].

Proof:

$$E\big[E[Y \mid X]\big] = \int E[Y \mid x]\, f_X(x)\,dx = \int\!\!\int y\, f_{Y|X}(y \mid x)\, f_X(x)\,dx\,dy$$
$$= \int\!\!\int y\, \frac{f_{XY}(x, y)}{f_X(x)}\, f_X(x)\,dx\,dy = \int\!\!\int y\, f_{XY}(x, y)\,dx\,dy = E[Y]$$
A) Covariance:

$$\operatorname{Cov}(X, Y) = E\big[(X - m_X)(Y - m_Y)\big] = \int\!\!\int (x - m_X)(y - m_Y)\, f_{XY}(x, y)\,dx\,dy$$

Expanding,

$$= E[XY - m_Y X - m_X Y + m_X m_Y] = E[XY] - m_Y E[X] - m_X E[Y] + m_X m_Y = E[XY] - m_X m_Y$$

Thus

$$C_{XY} = \operatorname{Cov}(X, Y) = E[XY] - m_X m_Y = R_{XY} - m_X m_Y$$

Note: If X and Y are either independent or uncorrelated, then E[XY] = E[X]E[Y] and Cov(X, Y) = 0.
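The one-way arrow "independence ⇒ uncorrelatedness" is worth seeing concretely. A minimal sketch (my own choice of example): with X uniform on (−1, 1) and Y = X², the covariance is E[X³] − E[X]E[X²] = 0, yet Y is completely determined by X:

```python
# Uncorrelated does not imply independent: X ~ uniform(-1, 1), Y = X^2.
import random

random.seed(5)
xs = [random.uniform(-1, 1) for _ in range(200000)]
ys = [x * x for x in xs]

mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

# Dependence shows up in higher moments: E[X^2 Y] = E[X^4] = 1/5,
# while E[X^2] E[Y] = (1/3)^2 = 1/9.  They differ, so X, Y are dependent.
e_x2y = sum(x * x * y for x, y in zip(xs, ys)) / len(xs)
```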
B) Correlation Coefficient (ρ)

The normalized 2nd-order joint central moment is called the correlation coefficient:

$$\rho_{XY} = \frac{\operatorname{Cov}(X, Y)}{\sigma_X \sigma_Y}, \qquad -1 \le \rho_{XY} \le 1$$

By definition,

$$\rho_{XY} = E\left[\left(\frac{X - m_X}{\sigma_X}\right)\left(\frac{Y - m_Y}{\sigma_Y}\right)\right], \qquad |\rho_{XY}| \le 1$$

where \(\sigma_X^2 = \operatorname{Var}[X] = E[X^2] - m_X^2\) and \(\sigma_Y^2 = \operatorname{Var}[Y] = E[Y^2] - m_Y^2\).

Transformation in Two Dimensions
Transformation in 2 Dimensions (1 function of 2 RVs)

One Function of Two RVs: Z = g(X, Y)

$$F_Z(z) = P[Z \le z] = P[g(X, Y) \le z] = \iint_{R_z} f_{XY}(x, y)\,dx\,dy$$

where R_z is the region of the (x, y) plane in which g(x, y) ≤ z; likewise f_Z(z)dz = P[z < Z ≤ z + dz] is the integral of f_XY over the strip between the contours g = z and g = z + dz.

Linear combination Z = aX + bY: the event {aX + bY ≤ z} is the half-plane below the line y = (z − ax)/b (for b > 0), so

$$F_Z(z) = P[aX + bY \le z] = \int_{-\infty}^{\infty}\int_{-\infty}^{(z - ax)/b} f_{XY}(x, y)\,dy\,dx$$

Differentiating with respect to z, and assuming X and Y independent,

$$f_Z(z) = \frac{1}{b}\int_{-\infty}^{\infty} f_X(x)\, f_Y\!\left(\frac{z - ax}{b}\right) dx \qquad (A1)$$

Note: this is the signal-plus-noise situation at a receiver output, (s + n) through h(t). For a = b = 1, (A1) is a convolution — recall a(t) * b(t) ↔ A(f)B(f).

[Figure: examples of convolution — two rectangular pdfs f_X and f_Y convolving into a trapezoidal/triangular f_Z supported on [a + c, b + d].]
Mean and Variance of a Sum

Let Z = X + Y, z = x + y. Then

$$E[Z] = E[X + Y] = \int\!\!\int (x + y)\, f_{XY}(x, y)\,dx\,dy = \int x f_X(x)\,dx + \int y f_Y(y)\,dy = E[X] + E[Y]$$

and, more generally, E[aX + bY] = aE[X] + bE[Y].

For the variance, with \(\bar z = m_X + m_Y\),

$$\operatorname{Var}[Z] = E[(Z - \bar z)^2] = E\big[((X - m_X) + (Y - m_Y))^2\big]$$

Expanding,

$$= E[(X - m_X)^2] + E[(Y - m_Y)^2] + 2E[(X - m_X)(Y - m_Y)]$$

That is,

$$\operatorname{Var}[Z] = \operatorname{Var}[X] + \operatorname{Var}[Y] + 2\operatorname{Cov}(X, Y)$$
$$\sigma_Z^2 = \sigma_X^2 + \sigma_Y^2 + 2C_{XY} = \sigma_X^2 + \sigma_Y^2 + 2\rho_{XY}\sigma_X\sigma_Y$$
Characteristic Function and MGF of Z = aX + bY

$$\Phi_Z(\omega) = E\big[e^{j\omega(aX + bY)}\big] = \Phi_{XY}(a\omega, b\omega)$$

If X and Y are independent,

$$\Phi_Z(\omega) = E\big[e^{ja\omega X}\big]\, E\big[e^{jb\omega Y}\big] = \Phi_X(a\omega)\,\Phi_Y(b\omega)$$

Similarly,

$$M_Z(t) = E\big[e^{t(aX + bY)}\big] = \int\!\!\int e^{t(ax + by)} f_{XY}(x, y)\,dx\,dy$$

and for independent X, Y,

$$M_Z(t) = \int e^{tax} f_X(x)\,dx \int e^{tby} f_Y(y)\,dy = M_X(at)\, M_Y(bt)$$

(for Z = X + Y, M_Z(t) = M_X(t)M_Y(t)). Note: the technique for the sum of two RVs is applicable to the difference of two RVs.

Two Functions of Two RVs

Let X1 and X2 be jointly continuous RVs with joint PDF f_{X1X2}(x1, x2). We want the PDF of functions such as Z = X1 + X2. Define two new RVs Y1 and Y2 as functions of X1 and X2:

$$Y_1 = g_1(x_1, x_2); \qquad Y_2 = g_2(x_1, x_2)$$

with inverse

$$x_1 = h_1(y_1, y_2); \qquad x_2 = h_2(y_1, y_2)$$

and Jacobian

$$J(y_1, y_2) = \det\begin{bmatrix} \dfrac{\partial h_1}{\partial y_1} & \dfrac{\partial h_1}{\partial y_2} \\[6pt] \dfrac{\partial h_2}{\partial y_1} & \dfrac{\partial h_2}{\partial y_2} \end{bmatrix} \ne 0$$

Then

$$f_{Y_1 Y_2}(y_1, y_2) = f_{X_1 X_2}\big(h_1(y_1, y_2),\, h_2(y_1, y_2)\big)\,|J|$$
Sum: Z = X + Y

Take z = x + y = g1(x, y) and w = x = g2(x, y), with inverse x = w = h1(w, z), y = z − w = h2(w, z). The Jacobian is

$$J = \det\begin{bmatrix} \dfrac{\partial h_1}{\partial w} & \dfrac{\partial h_1}{\partial z} \\[6pt] \dfrac{\partial h_2}{\partial w} & \dfrac{\partial h_2}{\partial z} \end{bmatrix} = \det\begin{bmatrix} 1 & 0 \\ -1 & 1 \end{bmatrix} = 1$$

so f_WZ(w, z) = f_XY(w, z − w), and

$$f_Z(z) = \int f_{XY}(w, z - w)\,dw = \int f_{XY}(x, z - x)\,dx$$

If X and Y are independent,

$$f_Z(z) = \int f_X(x)\, f_Y(z - x)\,dx = \int f_X(z - y)\, f_Y(y)\,dy$$

i.e., the pdf of the sum of independent RVs is the convolution of the marginal pdfs.
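The convolution result can be evaluated numerically. A minimal sketch (my own discretization), for X, Y independent and uniform on (0, 1), where the exact answer is the triangular pdf f_Z(z) = z on [0, 1] and 2 − z on [1, 2]:

```python
# pdf of Z = X + Y for independent uniform(0,1) RVs by numerically
# convolving the marginals: f_Z(z) = integral of f_X(x) f_Y(z - x) dx.
def f_X(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

f_Y = f_X

def f_Z(z, n=4000):
    # midpoint-rule evaluation of the convolution integral over x in [0, 1]
    h = 1.0 / n
    return sum(f_X((i + 0.5) * h) * f_Y(z - (i + 0.5) * h) for i in range(n)) * h

v_half = f_Z(0.5)   # triangular pdf gives 0.5
v_one = f_Z(1.0)    # peak value 1.0
v_three_halves = f_Z(1.5)   # 0.5 again, by symmetry
```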
Product: Z = XY

Take z = xy = g1(x, y) and w = x = g2(x, y), with inverse x = w = h1(w, z), y = z/w = h2(w, z). The Jacobian is

$$J = \det\begin{bmatrix} 1 & 0 \\[2pt] -\dfrac{z}{w^2} & \dfrac{1}{w} \end{bmatrix} = \frac{1}{w}$$

so \(f_{WZ}(w, z) = f_{XY}\!\left(w, \dfrac{z}{w}\right)\dfrac{1}{|w|}\), and

$$f_Z(z) = \int f_{XY}\!\left(x, \frac{z}{x}\right) \frac{1}{|x|}\,dx$$
Quotient: Z = X/Y

Take z = x/y = g1(x, y) and w = y = g2(x, y), with inverse x = zw = h1(w, z), y = w = h2(w, z). The Jacobian is

$$J = \det\begin{bmatrix} z & w \\ 1 & 0 \end{bmatrix} = -w, \qquad |J| = |w|$$

so f_WZ(w, z) = |w| f_XY(zw, w), and

$$f_Z(z) = \int |y|\, f_{XY}(zy, y)\,dy \qquad \left(= \int |y|\, f_X(zy)\, f_Y(y)\,dy \;\text{ if independent}\right)$$

Maximum Function: Z = max(X, Y)

$$F_Z(z) = P[X \le z,\, Y \le z] = F_{XY}(z, z) \qquad (= F_X(z)F_Y(z) \text{ if independent})$$

Minimum Function: Z = min(X, Y)

$$F_Z(z) = P[\{X \le z\} \cup \{Y \le z\}] = F_X(z) + F_Y(z) - F_{XY}(z, z)$$
Jointly Gaussian Random Variables

Marginals:

$$f_X(x) = \frac{1}{\sqrt{2\pi\sigma_x^2}}\exp\left\{-\frac{(x - m_x)^2}{2\sigma_x^2}\right\}, \qquad f_Y(y) = \frac{1}{\sqrt{2\pi\sigma_y^2}}\exp\left\{-\frac{(y - m_y)^2}{2\sigma_y^2}\right\}$$

If X and Y are jointly Gaussian, then the joint PDF is

$$f_{XY}(x, y) = \frac{1}{2\pi\sigma_x\sigma_y\sqrt{1 - \rho_{xy}^2}}\exp\left\{-\frac{1}{2(1 - \rho_{xy}^2)}\left[\left(\frac{x - m_x}{\sigma_x}\right)^2 - 2\rho_{xy}\left(\frac{x - m_x}{\sigma_x}\right)\left(\frac{y - m_y}{\sigma_y}\right) + \left(\frac{y - m_y}{\sigma_y}\right)^2\right]\right\}$$

where |ρ_xy| < 1.
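A correlated Gaussian pair is easy to simulate with the standard construction X = Z1, Y = ρZ1 + √(1 − ρ²)Z2 from two independent N(0,1) variables (this construction is a well-known identity, not something derived on these slides):

```python
# Sample a zero-mean, unit-variance jointly Gaussian pair with
# correlation rho, then estimate rho back from the samples.
import math, random

random.seed(6)
rho = 0.7
n = 200000
xs, ys = [], []
for _ in range(n):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    xs.append(z1)
    ys.append(rho * z1 + math.sqrt(1 - rho ** 2) * z2)

mx = sum(xs) / n
my = sum(ys) / n
sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
rho_est = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n * sx * sy)
```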
N Random Variables

Let X = [X1, X2, ..., XN]. The joint CDF is

$$F_X(x_1, \ldots, x_N) = P[X_1 \le x_1,\, X_2 \le x_2,\, \ldots,\, X_N \le x_N] \qquad \text{(continuous)}$$

and the joint PMF is \(P[X_1 = x_1, \ldots, X_N = x_N]\) (discrete).

Marginals are obtained by setting arguments to ∞ or integrating out variables:

$$F_X(x_1, \ldots, x_{N-1}) = F_X(x_1, \ldots, x_{N-1}, \infty), \qquad \text{e.g. } F_X(x_1, x_2) = F_X(x_1, x_2, \infty, \ldots, \infty)$$

$$F_X(x_1, \ldots, x_n) = \int_{-\infty}^{x_1}\!\!\cdots\!\int_{-\infty}^{x_n} f_X(\alpha_1, \ldots, \alpha_n)\,d\alpha_1\cdots d\alpha_n$$

$$f_X(x_1) = \int\!\!\cdots\!\!\int f_X(x_1, \ldots, x_n)\,dx_2\cdots dx_n, \qquad \text{e.g. } f_X(x_2, x_4) = \int\!\!\int f_X(x_1, x_2, x_3, x_4)\,dx_1\,dx_3$$

Independence: the N random variables X1, X2, ..., XN are independent if the events {X1 ≤ x1}, {X2 ≤ x2}, ..., {XN ≤ xN} are independent. This implies that

$$F_X(x_1, \ldots, x_N) = F_X(x_1)F_X(x_2)\cdots F_X(x_N)$$
$$f_X(x_1, \ldots, x_N) = f_X(x_1)f_X(x_2)\cdots f_X(x_N)$$
$$p_X(x_1, \ldots, x_N) = p_X(x_1)p_X(x_2)\cdots p_X(x_N)$$

It follows that any subset of the x_i is a set of independent random variables. For example, for N = 3 with x1, x2, x3 independent,

$$f_X(x_1, x_2) = f_X(x_1)f_X(x_2), \quad f_X(x_1, x_3) = f_X(x_1)f_X(x_3), \quad f_X(x_2, x_3) = f_X(x_2)f_X(x_3)$$
Conversely, the pairwise factorizations

$$f_X(x_1, x_2) = f_X(x_1)f_X(x_2), \quad f_X(x_1, x_3) = f_X(x_1)f_X(x_3), \quad f_X(x_2, x_3) = f_X(x_2)f_X(x_3)$$

do not by themselves imply joint independence. In general we can use the chain rule to write the joint pdf as

$$f_X(x_1, \ldots, x_N) = f(x_N \mid x_{N-1}, \ldots, x_1)\, f(x_{N-1} \mid x_{N-2}, \ldots, x_1) \cdots f(x_2 \mid x_1)\, f(x_1)$$

Correspondingly, the conditional CDF is

$$F_X(x_N, \ldots, x_{k+1} \mid x_k, \ldots, x_1) = \int_{-\infty}^{x_{k+1}}\!\!\cdots\!\int_{-\infty}^{x_N} f(\alpha_{k+1}, \ldots, \alpha_N \mid x_k, \ldots, x_1)\,d\alpha_N \cdots d\alpha_{k+1}$$

Expected values:

$$E[g(x_1, x_2, \ldots, x_N)] = \int\!\!\cdots\!\!\int g(x_1, \ldots, x_N)\, f_X(x_1, \ldots, x_N)\,dx_1\cdots dx_N$$

For N random variables X1, X2, ..., XN, the (n1 + n2 + ... + nN)-order joint moments are defined by

$$E\big[X_1^{n_1} X_2^{n_2} \cdots X_N^{n_N}\big] = \int\!\!\cdots\!\!\int x_1^{n_1} x_2^{n_2} \cdots x_N^{n_N}\, f_X(x_1, \ldots, x_N)\,dx_1 \cdots dx_N$$

Joint characteristic function:

$$\Phi_X(\omega_1, \omega_2, \ldots, \omega_N) = E\big[e^{j(\omega_1 X_1 + \omega_2 X_2 + \cdots + \omega_N X_N)}\big]$$

If independent,

$$\Phi_X(\omega_1, \ldots, \omega_N) = E\big[e^{j\omega_1 X_1}\big] E\big[e^{j\omega_2 X_2}\big] \cdots E\big[e^{j\omega_N X_N}\big] = \Phi_{X_1}(\omega_1)\Phi_{X_2}(\omega_2)\cdots\Phi_{X_N}(\omega_N)$$

Similarly for the joint MGF: if independent,

$$M_X(t_1, t_2, \ldots, t_N) = E\big[e^{t_1 X_1}\big] E\big[e^{t_2 X_2}\big] \cdots E\big[e^{t_N X_N}\big] = M_{X_1}(t_1) M_{X_2}(t_2) \cdots M_{X_N}(t_N)$$
Jointly Gaussian Random Vector

With the column vectors \(\mathbf{X} = [x_1, x_2, \ldots, x_N]^T\) and \(\mathbf{m} = [m_1, m_2, \ldots, m_N]^T\),

$$f_X(\mathbf{X}) = \frac{1}{(2\pi)^{N/2}\,|\mathbf{K}|^{1/2}}\exp\left\{-\frac{1}{2}(\mathbf{X} - \mathbf{m})^T \mathbf{K}^{-1} (\mathbf{X} - \mathbf{m})\right\}$$

where K is the covariance matrix

$$\mathbf{K} = \begin{bmatrix} C_{11} & C_{12} & \cdots & C_{1N} \\ C_{21} & C_{22} & \cdots & C_{2N} \\ \vdots & & \ddots & \vdots \\ C_{N1} & C_{N2} & \cdots & C_{NN} \end{bmatrix}, \qquad C_{ij} = \begin{cases} \sigma_{X_i}^2, & i = j \\ \operatorname{Cov}(X_i, X_j), & i \ne j \end{cases}$$

For N = 2,

$$\mathbf{K} = \begin{bmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2 \\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{bmatrix}$$
Sums of Random Variables

$$E\left[\sum_{k=1}^{N} a_k X_k\right] = \sum_{k=1}^{N} a_k E[X_k] \qquad \text{(no independence needed)}$$

$$\operatorname{Var}\left[\sum_{i=1}^{N} a_i X_i\right] = E\left[\left\{\sum_{j=1}^{N} a_j\big(X_j - E[X_j]\big)\right\}\left\{\sum_{k=1}^{N} a_k\big(X_k - E[X_k]\big)\right\}\right]$$
$$= \sum_{j=1}^{N}\sum_{k=1}^{N} a_j a_k \operatorname{Cov}(X_j, X_k) = \sum_{j=1}^{N} a_j^2 \operatorname{Var}[X_j] + \sum_{j \ne k} a_j a_k \operatorname{Cov}(X_j, X_k)$$

Sample mean: define

$$\hat\mu_N = \frac{1}{N}(X_1 + X_2 + \cdots + X_N) = \frac{1}{N}\sum_{k=1}^{N} X_k$$

Then

$$E[\hat\mu_N] = \frac{1}{N}\sum_{k=1}^{N} E[X_k] = \mu_X \qquad \text{(the sample mean is unbiased)}$$

Variance of the sample mean (independence is assumed):

$$\operatorname{Var}[\hat\mu_N] = E\big[(\hat\mu_N - \mu_X)^2\big] = \frac{1}{N^2}\sum_{j=1}^{N}\sum_{k=1}^{N} E[X_j X_k] - \mu_X^2$$

but

$$E[X_j^2] = \sigma_X^2 + \mu_X^2, \qquad E[X_j X_k] = \mu_X^2 \;\; (j \ne k)$$

Substituting,

$$\operatorname{Var}[\hat\mu_N] = \frac{1}{N^2}\big[N(\sigma_X^2 + \mu_X^2) + N(N-1)\mu_X^2\big] - \mu_X^2 = \frac{\sigma_X^2}{N}$$

This means that the variance of \(\hat\mu_N\) is 1/N times the variance of the RV X_k, hence

$$\operatorname{Var}[\hat\mu_N] \to 0 \text{ as } N \to \infty$$

This implies that the probability that \(\hat\mu_N\) deviates from the true mean by any fixed amount approaches zero as N becomes larger and larger.
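The σ²/N law is straightforward to confirm empirically. A minimal sketch (my own trial counts), using uniform(0,1) samples with σ² = 1/12:

```python
# Var[sample mean] = sigma^2 / N: estimate the variance of the N-sample
# mean of uniform(0,1) RVs over many independent trials.
import random

random.seed(7)
N, trials = 25, 20000
means = [sum(random.random() for _ in range(N)) / N for _ in range(trials)]

m = sum(means) / trials
var_of_mean = sum((x - m) ** 2 for x in means) / trials
expected = (1 / 12) / N    # sigma^2 / N
```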
Substituting Var[μ̂_N] = σ_X²/N into the Chebyshev inequality gives

$$P\big[|\hat\mu_N - \mu_X| \ge a\big] \le \frac{\sigma_X^2}{N a^2}$$

Stochastic Processes
(a.k.a. Random Processes)
Definitions

Stochastic Processes - 1

A stochastic (random) process X(t, s) is a family of random variables indexed by time t and defined on a common probability space (S, F, P): to each outcome s ∈ S corresponds a realization, sample path, or sample function x(t).

[Figure: ensemble of sample functions x1(t), x2(t), ..., xn(t) for outcomes s1, s2, ..., sn, with sampling instants t_k, t_{k+1} marked over the observation interval.]

- X(t_j, s) = X(t_j), for a fixed time t_j, is a random variable
- X(t, s) = X(t) is a random process (the family X_j(t, s), j = 1, 2, ..., n)
- X(t_i, s_j) is a real number
Random process = collection of RVs. Sampling the process at t1, t2, ..., t_k yields

$$X_1 = X(t_1, s),\; X_2 = X(t_2, s),\; \ldots,\; X_k = X(t_k, s)$$

(a discrete RP is a countable collection of RVs). The first-order CDF and PDF are

$$F_X(x, t) = P[X(t) \le x], \qquad f_X(x, t) = \frac{\partial F_X(x, t)}{\partial x}$$

and the kth-order joint density is

$$f_X(x_1, \ldots, x_k;\, t_1, \ldots, t_k) = \frac{\partial^k F_X(x_1, \ldots, x_k;\, t_1, \ldots, t_k)}{\partial x_1 \partial x_2 \cdots \partial x_k}$$

with joint pmf \(p_X(x_1, \ldots, x_k) = P[X_1 = x_1,\, X_2 = x_2,\, \ldots,\, X_k = x_k]\) in the discrete case. For the second order,

$$f_X(x_1, x_2;\, t_1, t_2) = \frac{\partial^2 F_X(x_1, x_2;\, t_1, t_2)}{\partial x_1 \partial x_2}$$

As with random variables, the marginal cdf and pdf of a RP are obtained by integrating out variables, e.g.

$$f_X(x_1;\, t_1) = \int f_X(x_1, x_2;\, t_1, t_2)\,dx_2$$
A. Mean of X(t)

First-order statistics (i.e., functions of one time instant of the random process): the mean of a random process X(t) is given by

$$m_X(t) = E[X(t)] = \int_{-\infty}^{\infty} x\, f_X(x, t)\,dx$$

The corresponding time averages of a single sample function are

$$\langle X(t) \rangle = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} x(t)\,dt, \qquad \langle X^2(t) \rangle = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} x^2(t)\,dt$$
B. Variance of X(t)

The variance of a random process X(t) is given by

$$\sigma_X^2(t) = \operatorname{Var}[X(t)] = E\big[(X(t) - m_X(t))^2\big] = E[X^2(t)] - \big(E[X(t)]\big)^2$$

C. Autocorrelation of X(t)

The Autocorrelation Function (ACF) of a RP X(t) is denoted as either R_XX(t1, t2), R_X(t1, t2), or R_XX(t, t+τ):

$$R_{XX}(t_1, t_2) = E\big[X(t_1) X^*(t_2)\big] = \int\!\!\int x_1 x_2\, f(x_1, x_2;\, t_1, t_2)\,dx_1\,dx_2$$

For a real process it follows that R_XX(t1, t2) = E[X(t2)X(t1)] = R_XX(t2, t1). This implies that we may also define the autocorrelation function as R_X(τ) = E[X(t)X(t+τ)], with τ = t2 − t1.

Properties (WSS process):

1) R_X(0) = E[X²(t)] — the average power.
2) R_X(−τ) = R_X(τ) — even symmetry, since E[X(t)X(t−τ)] = E[X(t′+τ)X(t′)].
3) |R_X(τ)| ≤ R_X(0).
Proof: consider E[(X(t+τ) ± X(t))²] ≥ 0. Expanding,

$$E[X^2(t+\tau)] + E[X^2(t)] \pm 2E[X(t)X(t+\tau)] \ge 0 \;\Rightarrow\; 2R_X(0) \pm 2R_X(\tau) \ge 0$$

hence −R_X(0) ≤ R_X(τ) ≤ R_X(0).
4) If X(t) contains a constant (dc) component A, then R_X(τ) contains A²: for X(t) = A, R_X(τ) = E[X(t)X(t+τ)] = E[A²] = A².
5) If X(t) has a periodic component, then R_X(τ) will also have a periodic component with the same period.
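Time-average ACF estimation is easy to demonstrate on a simulated process. A sketch under my own assumptions — an AR(1) recursion X[n] = aX[n−1] + W[n] with unit-variance white W, for which the standard result is R_X(k) = a^|k|/(1 − a²), so R_X(1)/R_X(0) = a:

```python
# Estimate the ACF of a zero-mean WSS sequence from one long sample path
# (a time average), and compare R_X(1)/R_X(0) with the AR(1) parameter a.
import random

random.seed(8)
a, n = 0.6, 200000
x, xs = 0.0, []
for _ in range(n):
    x = a * x + random.gauss(0, 1)
    xs.append(x)

def acf(lag):
    # time-average estimate of R_X(lag) = E[X[n] X[n+lag]]
    return sum(xs[i] * xs[i + lag] for i in range(n - lag)) / (n - lag)

r0, r1 = acf(0), acf(1)
ratio = r1 / r0    # should be close to a = 0.6
```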
D. Autocovariance of X(t)

The autocovariance of a RP X(t) is given by

$$C_X(t_1, t_2) = E\big[\{X(t_1) - m_X(t_1)\}\{X(t_2) - m_X(t_2)\}\big] = R_X(t_1, t_2) - m_X(t_1)\, m_X(t_2)$$

E. Correlation Coefficient

The correlation coefficient of a RP X(t) is given by

$$\rho_X(t_1, t_2) = \frac{C_X(t_1, t_2)}{\sqrt{C_X(t_1, t_1)\, C_X(t_2, t_2)}}, \qquad |\rho_X(t_1, t_2)| \le 1$$
Power Spectral Density

Truncate a sample function to the observation window:

$$x_T(t) = \begin{cases} x(t), & -T \le t \le T \\ 0, & \text{else} \end{cases} \qquad X_T(f) = \int_{-T}^{T} x_T(t)\, e^{-j2\pi ft}\,dt$$

The power spectral density (PSD) of the process is

$$S_X(f) = \lim_{T\to\infty} \frac{1}{2T}\, E\big[|X_T(f)|^2\big]$$

Wiener–Khintchine Theorem: for a WSS process the PSD and the ACF form a Fourier-transform pair, R_X(τ) ↔ S_X(f):

$$S_X(f) = \begin{cases} \int R_X(\tau)\, e^{-j2\pi f\tau}\,d\tau, & \text{continuous} \\[4pt] \sum_k R_X(k)\, e^{-j2\pi k f}, & \text{discrete} \end{cases}$$

Conversely,

$$R_X(\tau) = F^{-1}[S_X(f)] = \int S_X(f)\, e^{j2\pi f\tau}\,df$$

Since R_X(τ) is even,

$$S_X(f) = \int R_X(\tau)\big(\cos 2\pi f\tau - j\sin 2\pi f\tau\big)\,d\tau = \int R_X(\tau)\cos(2\pi f\tau)\,d\tau$$

(the odd sine term integrates to zero, so S_X(f) is real).

Special values:

$$R_X(0) = E[X^2(t)] = \int S_X(f)\,df$$

This is the area under the PSD curve; it is also known as the Average Power. Conversely,

$$S_X(0) = \int R_X(\tau)\,d\tau$$
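The discrete form of the Wiener–Khintchine sum can be checked directly. A sketch under my own assumptions — R_X(k) = a^|k| (the normalized AR(1) ACF), whose PSD has the known closed form (1 − a²)/(1 − 2a cos 2πf + a²):

```python
# Discrete Wiener-Khintchine check: S_X(f) = sum_k R_X(k) e^{-j 2 pi k f}
# for R_X(k) = a^|k|, compared against its closed form.
import cmath, math

a, f = 0.6, 0.15
K = 200    # a^200 is negligible, so truncating the sum there is safe

s_sum = sum((a ** abs(k)) * cmath.exp(-2j * math.pi * k * f)
            for k in range(-K, K + 1))
s_closed = (1 - a * a) / (1 - 2 * a * math.cos(2 * math.pi * f) + a * a)
```

The imaginary part of `s_sum` cancels by the even symmetry of R_X(k) — the PSD of a real process is real, as stated above.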
Stationarity

A process is strict-sense stationary (SSS) if its statistics are invariant to a shift of the time origin:

$$F_{X(t+\Delta t)}(x) = F_{X(t+\Delta t+c)}(x) = F_{X(t)}(x), \qquad f_{X(t+\Delta t)}(x) = f_{X(t+\Delta t+c)}(x) = f_{X(t)}(x)$$

and, for every order n,

$$f_X(x_1, x_2, \ldots, x_n;\, t_1, t_2, \ldots, t_n) = f_X(x_1, x_2, \ldots, x_n;\, t_1 + c, t_2 + c, \ldots, t_n + c)$$

where the left side represents the joint PDF of the RVs

$$X_1 = X(t_1 + c),\; X_2 = X(t_2 + c),\; \ldots,\; X_n = X(t_n + c)$$

for all t_i, i = 1, 2, ..., n, all n = 1, 2, ..., and any c.

A process is wide-sense stationary (WSS) if
1. Its mean is constant: E[X(t)] = E[X(t + τ)] = m_X.
2. Its autocorrelation (or autocovariance) depends only on τ = t1 − t2: R_X(τ) = E[X(t)X(t + τ)].

Note:
- If a process is SSS, then it is also WSS.
- The converse is not true except when the process is Gaussian, i.e., for a Gaussian process, WSS also implies SSS.
- As sets of processes: SSS ⊂ WSS ⊂ all stochastic processes.
- A process whose statistics repeat with period T, m_X(t) = E[X(t + kT)] and R_X(τ) = R_X(τ; t + kT), is called cyclostationary.
Ergodicity

A process is ergodic when its time averages equal the corresponding ensemble averages.

1. Ergodic in the mean: ⟨X(t)⟩ = E[X(t)], and for higher moments ⟨Xⁿ(t)⟩ = E[Xⁿ(t)].
2. Ergodic in autocorrelation: ⟨X(t1)X(t2)⟩ = R_X(t1, t2).

Independent increments: for t1 < t2 < ... < t_k, the increments

$$Y_1 = X(t_2) - X(t_1),\; Y_2 = X(t_3) - X(t_2),\; \ldots,\; Y_{k-1} = X(t_k) - X(t_{k-1})$$

are independent RVs.

Markov process: the future depends on the past only through the present:

$$P\big[X(t_k) \le x_k \mid x_{k-1}, \ldots, x_1\big] = P\big[X(t_k) \le x_k \mid x_{k-1}\big]$$
Cross-Correlation Function

$$R_{XY}(t_1, t_2) = E\big[X(t_1)\, Y(t_2)\big] = \int\!\!\int x\, y\, f_{XY}(x, y;\, t_1, t_2)\,dx\,dy$$

Properties of the CCF (jointly WSS processes, R_XY(τ) = E[X(t)Y(t+τ)]):

1) R_XY(−τ) = R_YX(τ):

$$R_{XY}(-\tau) = E[X(t)Y(t-\tau)] = E[Y(t')X(t'+\tau)] = R_{YX}(\tau)$$

2) |R_XY(τ)|² ≤ R_X(0) R_Y(0).
3) |R_XY(τ)| ≤ ½[R_X(0) + R_Y(0)].
Proof: E[(X(t) ± Y(t+τ))²] ≥ 0, so

$$E[X^2(t)] + E[Y^2(t+\tau)] \pm 2E[X(t)Y(t+\tau)] \ge 0 \;\Rightarrow\; R_X(0) + R_Y(0) \ge 2|R_{XY}(\tau)|$$

4) If X and Y are independent processes,

$$R_{XY}(\tau) = E[X(t)Y(t+\tau)] = E[X(t)]\,E[Y(t+\tau)] = m_X m_Y = R_{YX}(\tau) \quad \text{(a constant)}$$

5) For Z(t) = X(t) + Y(t),

$$R_Z(\tau) = R_X(\tau) + R_Y(\tau) + R_{XY}(\tau) + R_{YX}(\tau)$$
$$S_Z(f) = S_X(f) + S_Y(f) + S_{XY}(f) + S_{YX}(f) \qquad \big(= S_X + S_Y + 2S_{XY} \text{ if } R_{XY} = R_{YX}\big)$$
6) If X(t) and Y(t) are jointly ergodic, the time cross-correlations equal the ensemble ones:

$$\langle x(t)y(t+\tau)\rangle = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T} x(t)y(t+\tau)\,dt = R_{XY}(\tau), \qquad \langle y(t)x(t+\tau)\rangle = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T} y(t)x(t+\tau)\,dt = R_{YX}(\tau)$$

Two processes are orthogonal if R_XY(τ) = 0, i.e., E[X(t)Y(t+τ)] = 0 for all t and τ.

C. Cross-Covariance (CC)

$$C_{XY}(t_1, t_2) = E\big[\{X(t_1) - m_X(t_1)\}\{Y(t_2) - m_Y(t_2)\}\big] = R_{XY}(t_1, t_2) - m_X(t_1)\,m_Y(t_2)$$

X and Y are uncorrelated if C_XY(t1, t2) = 0.

D. Cross-Spectral Density

$$S_{XY}(f) = \begin{cases} \int R_{XY}(\tau)\, e^{-j2\pi f\tau}\,d\tau, & \text{continuous} \\[4pt] \sum_k R_{XY}(k)\, e^{-j2\pi kf}, & \text{discrete} \end{cases}$$
Random Processes and Linear Systems

[Figure: input x(t) (or x[n], X(e^{jw}), X(f), X(z), R_X, S_X) applied to a linear network h(t) (h[n], H(e^{jw}), H(f), H(z)) producing output y(t) (y[n], Y(e^{jw}), Y(f), Y(z), R_Y, S_Y). The system may be described by its time function, difference equation, pole–zero plot, or H-function, with a random process at the input.]

Output process:

$$Y(t) = h(t) * X(t) = \int h(\tau)\, X(t-\tau)\,d\tau = \int h(t-\tau)\, X(\tau)\,d\tau$$

Mean: for a WSS input with E[X(t)] = E[X(t−τ)] = m_x,

$$E[Y(t)] = E\left[\int h(\tau)X(t-\tau)\,d\tau\right] = \int h(\tau)\, E[X(t-\tau)]\,d\tau = m_x \int h(\tau)\,d\tau = m_x H(0)$$

Mean-square output:

$$E[Y^2(t)] = E\left[\int\!\!\int X(t-s)X(t-r)\, h(s)h(r)\,ds\,dr\right] = \int\!\!\int E[X(t-s)X(t-r)]\, h(s)h(r)\,ds\,dr$$

But E[X(t−s)X(t−r)] = R_X((t−s) − (t−r)) = R_X(r − s), so the output statistics depend only on the input ACF and h.

Output PSD: transforming the output ACF,

$$S_Y(f) = \int R_Y(\tau)\, e^{-j2\pi f\tau}\,d\tau = \int\!\!\int\!\!\int h(s)\,h(r)\, R_X(\tau + s - r)\, e^{-j2\pi f\tau}\,ds\,dr\,d\tau$$

With the substitution u = τ + s − r,

$$S_Y(f) = \int h(s)\,e^{j2\pi fs}\,ds \int h(r)\,e^{-j2\pi fr}\,dr \int R_X(u)\,e^{-j2\pi fu}\,du = H^*(f)\,H(f)\,S_X(f) = |H(f)|^2\, S_X(f)$$

Cross relationships between input and output processes:

$$R_{XY}(\tau) = E[X(t)Y(t+\tau)] = E\left[X(t)\int X(t+\tau-r)\,h(r)\,dr\right] = \int R_X(\tau - r)\,h(r)\,dr = R_X(\tau) * h(\tau)$$

$$S_{XY}(f) = H(f)\,S_X(f), \qquad S_{YX}(f) = H^*(f)\,S_X(f)$$
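A discrete-time instance of these relations: for white input (flat PSD, variance σ²) through an FIR filter h, the output power R_Y(0) = ∫|H(f)|²S_X(f)df reduces to σ²Σ h[k]². A minimal sketch (my own filter taps):

```python
# White noise through an FIR filter: output variance = sigma^2 * sum h[k]^2,
# the discrete-time counterpart of R_Y(0) = integral |H(f)|^2 S_X(f) df.
import random

random.seed(9)
h = [0.5, 0.3, 0.2]
sigma2 = 1.0
n = 200000
w = [random.gauss(0, 1) for _ in range(n)]

# Convolve the filter with the white sequence (skip the startup transient).
y = [sum(h[k] * w[i - k] for k in range(len(h))) for i in range(len(h), n)]
var_y = sum(v * v for v in y) / len(y)
var_pred = sigma2 * sum(c * c for c in h)   # = 0.38 for these taps
```

Since the input mean is 0, E[Y] = m_x·H(0) = 0 here as well.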