The joint PDF of a process with independent Gaussian increments, sampled at times t_1 < t_2 < ⋯ < t_k, is

f_W(w) = ∏_{n=1}^{k} [1/√(2πα(t_n − t_{n−1}))] · e^{−(w_n − w_{n−1})² / [2α(t_n − t_{n−1})]}.
EE 214
Probability and Random Processes
Slide No.1
The Call Originating Process (Use of the Poisson Process to Model Call Arrivals)
Let's consider random call origination, where t ≥ 0 (Figure 1):
(1) The probability that a call originates in the time interval (t, t + Δt] is λΔt, independent of t, where λ is a constant.
(2) The probability that two or more calls originate in (t, t + Δt] approaches zero as Δt → 0.
(3) Calls originate independently of other calls.
p_k(t) = lim_{n→∞} (n choose k) (λt/n)^k (1 − λt/n)^{n−k} = [(λt)^k / k!] e^{−λt},
which is the Poisson distribution with mean λt, where λ is called the arrival rate or origination rate. λ is also interpreted as the mean number of arrivals per unit time, which is equal to c (in a = ch).
The probability that no calls are originated in (0, t] is obtained by setting k = 0 in

p_k(t) = [(λt)^k / k!] e^{−λt},

giving

p_0(t) = e^{−λt}.
Take note that p_0(t) is also P(interarrival time X_n > t), so that P(X_n ≤ t) = 1 − P(X_n > t). Hence, the probability of a call arrival by time t is

A(t) = 1 − e^{−λt}.
Therefore, a random call origination process has an exponential interarrival time distribution.
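This equivalence can be sketched numerically. A minimal simulation, assuming NumPy is available (the rate λ = 2 and horizon t = 5 are illustrative values, not from the slides): arrivals are generated from exponential interarrival times, and the counts in (0, t] should then behave like a Poisson(λt) random variable, with mean and variance both equal to λt.

```python
import numpy as np

# Sketch (illustrative values): simulate call originations with rate lam by
# drawing exponential interarrival times, then count arrivals in (0, t].
rng = np.random.default_rng(0)
lam, t, trials = 2.0, 5.0, 20000

gaps = rng.exponential(1.0 / lam, size=(trials, 80))  # interarrival times
arrivals = np.cumsum(gaps, axis=1)                    # arrival instants
counts = (arrivals <= t).sum(axis=1)                  # arrivals in (0, t]

# For a Poisson process the count in (0, t] has mean and variance lam * t.
print(counts.mean(), counts.var())  # both should be near lam * t = 10
```

The 80 interarrival draws per trial are simply enough to cover the horizon with overwhelming probability.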
Service Time Distribution
Assume a random call termination rate μ, also called the mean service rate. Since calls are randomly terminated with random call holding times, the situation is very similar to the call origination process. Applying the procedure above, we arrive at the result that the service time has an exponential distribution. That is,

H(t) = 1 − e^{−μt}.
1st Semester 2012-2013
Review
Covariance The covariance of two random variables X and Y is

Cov[X, Y] = σ_XY = E[(X − μ_X)(Y − μ_Y)].

Correlation The correlation of two random variables X and Y is

r_XY = E[XY].

Theorem: If X = Y, then

r_XY = E[X²] = E[Y²].
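The two definitions above are linked by the standard identity Cov[X, Y] = r_XY − E[X]E[Y], which can be checked empirically. A minimal sketch, assuming NumPy is available; the sample distributions (a normal X and a noisy linear Y) are illustrative choices:

```python
import numpy as np

# Sketch: empirical check of Cov[X, Y] = E[(X - mu_X)(Y - mu_Y)], r_XY = E[XY],
# and the identity Cov[X, Y] = r_XY - E[X]E[Y] (illustrative distributions).
rng = np.random.default_rng(1)
x = rng.normal(1.0, 2.0, 100000)
y = 0.5 * x + rng.normal(0.0, 1.0, 100000)

cov_xy = np.mean((x - x.mean()) * (y - y.mean()))  # Cov[X, Y]
r_xy = np.mean(x * y)                              # r_XY = E[XY]

print(abs(cov_xy - (r_xy - x.mean() * y.mean())) < 1e-9)  # identity holds
```

The identity holds exactly for the sample versions as well, up to floating-point error, which is why the tolerance can be so tight.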
Autocovariance The autocovariance of the stochastic process X(t) is

C_X(t, τ) = Cov[X(t), X(t + τ)].

The autocovariance of the random sequence X_n is

C_X[m, k] = Cov[X_m, X_{m+k}].
Recall: Cov[X, Y] is an indication of how much information is being carried by rv X about rv Y. The higher the covariance, the more accurate the information indicated by X about Y. Consider a random process X(t). At time instants t_1 and t_2 = t_1 + τ, the covariance is inversely proportional to the rate of change of the process during the τ seconds.
That is, a small covariance value (near zero) indicates rapid change during the τ seconds.

Autocorrelation Function The autocorrelation function of the stochastic process X(t) is

R_X(t, τ) = E[X(t) X(t + τ)].
Theorem 7.1 The autocorrelation and the autocovariance functions of a stochastic process X(t) satisfy

C_X(t, τ) = R_X(t, τ) − μ_X(t) μ_X(t + τ).

The autocorrelation and the autocovariance functions of a random sequence X_n satisfy

C_X[m, k] = R_X[m, k] − μ_X(m) μ_X(m + k).

Stationary Process For a stationary process, the PDF of X(t) is the same at any two instants t_1 and t_1 + τ:

f_{X(t_1)}(x) = f_{X(t_1+τ)}(x) = f_X(x).

Thus for a stationary process, the same random variable is observed for all time instants. The key idea in the stationary property is that the statistical properties of the random process do not change with time.
Theorem 7.2 Let X(t) be a stationary random process. For constants a > 0 and b,
Y(t) = aX(t) + b is also a stationary process.
Theorem 7.3 For a stationary process X(t), the expected value, the autocorrelation, and the autocovariance have the ff. properties for all t:
(a) E[X(t)] = μ_X,
(b) R_X(t, τ) = R_X(0, τ) = R_X(τ),
(c) C_X(t, τ) = R_X(τ) − μ_X² = C_X(τ).
For a stationary random sequence X_n the expected value, the autocorrelation, and the autocovariance satisfy for all n:
(a) E[X_n] = μ_X,
(b) R_X[m, k] = R_X[0, k] = R_X[k],
(c) C_X[m, k] = R_X[k] − μ_X² = C_X[k].
This shows the time-invariant properties of stationary processes. That is, the expected value, the autocovariance, and the autocorrelation are independent of t and depend only on the time-difference variable τ.
Wide Sense Stationary Stochastic Processes (WSS)
WSS Process X(t) is a wide sense stationary stochastic process if and only if, for all t,

E[X(t)] = μ_X,
and
R_X(t, τ) = R_X(0, τ) = R_X(τ).

WSS Sequence X_n is a wide sense stationary random sequence if and only if, for all n,

E[X_n] = μ_X,
and
R_X[m, k] = R_X[0, k] = R_X[k].

Stationary processes (also called Strict Sense Stationary (SSS)) are also WSS, but NOT the other way around.
Theorem 7.4 For a WSS process X(t), the autocorrelation function R_X(τ) has the ff. properties:

R_X(0) ≥ 0,
R_X(τ) = R_X(−τ),
R_X(0) ≥ |R_X(τ)|.
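These three properties can be checked against an empirical autocorrelation estimate. A minimal sketch, assuming NumPy; the AR(1) sequence used as the WSS process is an illustrative choice, not one from the slides:

```python
import numpy as np

# Sketch: estimate R_X[k] for a (zero-mean) WSS sequence and check
# R_X[0] >= 0, R_X[k] = R_X[-k], and R_X[0] >= |R_X[k]|.
rng = np.random.default_rng(2)
n = 100000
x = np.empty(n)
x[0] = rng.normal()
for i in range(1, n):
    x[i] = 0.8 * x[i - 1] + rng.normal()   # illustrative AR(1) process

def R(k):
    # Empirical autocorrelation at (possibly negative) lag k.
    a, b = (x[:n - k], x[k:]) if k >= 0 else (x[-k:], x[:n + k])
    return float(np.mean(a * b))

print(R(0) >= 0, abs(R(3) - R(-3)) < 1e-9, R(0) >= abs(R(5)))
```

The symmetry property falls out of the estimator's structure (the same products are averaged for lags k and −k), while the other two hold for any valid autocorrelation.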
Ergodic Process A WSS process X(t) is ergodic when the time average of a sample function converges to the ensemble average:

lim_{T→∞} X̄(T) = μ_X.

When the time average of the sample function of a WSS process is equal to the ensemble average, the process is ergodic. This simply means that if we take the time average of any sample function of a WSS process, it will be the same as the ensemble average (Note: pls. review Slide No. 1).
Cross-Correlation Function The cross-correlation of continuous-time random processes X(t) and Y(t) is

R_XY(t, τ) = E[X(t) Y(t + τ)].

Cross-Correlation Function The cross-correlation of random sequences X_n and Y_n is

R_XY[m, k] = E[X_m Y_{m+k}].

Jointly Wide Sense Stationary Continuous-time random processes X(t) and Y(t) are jointly WSS if both are WSS and the cross-correlation depends only on the time difference between the two rvs:

R_XY(t, τ) = R_XY(τ).
Jointly Wide Sense Stationary Random sequences X_n and Y_n are jointly WSS if both are WSS and the cross-correlation depends only on the time difference between the two rvs:

R_XY[m, k] = R_XY[k].

Theorem 7.5 If X(t) and Y(t) are jointly WSS continuous-time processes, then

R_XY(τ) = R_YX(−τ).

If X_n and Y_n are jointly WSS random sequences, then

R_XY[k] = R_YX[−k].
(The proof uses the substitution u = t + τ.)
Example 7.1: Suppose we are interested in X(t) but we can only observe

Y(t) = X(t) + N(t),

where N(t) is a noise process that interferes with our observation of X(t). Assume that X(t) and N(t) are independent WSS processes with E[X(t)] = μ_X and E[N(t)] = μ_N = 0.
(a) Is Y(t) a WSS process? (b) Are X(t) and Y(t) jointly WSS? (c) Are Y(t) and N(t) jointly WSS?
(a) For Y(t) we obtain:

E[Y(t)] = E[X(t)] + E[N(t)] = μ_X + 0 = μ_X.

The autocorrelation is:

R_Y(t, τ) = E[Y(t)Y(t + τ)] = E[(X(t) + N(t))(X(t + τ) + N(t + τ))]
= E{X(t)[X(t + τ) + N(t + τ)] + N(t)[X(t + τ) + N(t + τ)]}
= E[X(t)X(t + τ)] + E[X(t)N(t + τ)] + E[N(t)X(t + τ)] + E[N(t)N(t + τ)]
= R_X(τ) + R_XN(t, τ) + R_NX(t, τ) + R_N(τ).

Since X(t) and N(t) are independent and E[N(t)] = 0,

R_XN(t, τ) = E[X(t)] E[N(t + τ)] = 0, and similarly R_NX(t, τ) = 0.

Hence we have:

R_Y(t, τ) = R_X(τ) + R_N(τ).

Since E[Y(t)] = μ_X is constant and R_Y(t, τ) depends only on τ, Y(t) is WSS.
(b) To determine if Y(t) and X(t) are jointly WSS, we calculate the cross-correlation:

R_YX(t, τ) = E[Y(t)X(t + τ)] = E[(X(t) + N(t)) X(t + τ)]
= E[X(t)X(t + τ)] + E[N(t)X(t + τ)] = R_X(τ) + R_NX(t, τ) = R_X(τ).

Since both processes are WSS and R_YX(t, τ) depends only on τ, X(t) and Y(t) are jointly WSS.
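The additive structure R_Y(t, τ) = R_X(τ) + R_N(τ) can be sketched with a Monte Carlo ensemble. Assuming NumPy; the particular choice of X(t) = B (a random DC level) and white noise samples for N is illustrative, not from the example:

```python
import numpy as np

# Numerical sketch of Example 7.1: Y(t) = X(t) + N(t) with X, N independent
# and E[N(t)] = 0, so R_Y(t, tau) = R_X(tau) + R_N(tau).
rng = np.random.default_rng(3)
trials = 200000
b = rng.normal(2.0, 1.0, trials)    # X(t) = B for all t: R_X(tau) = E[B^2] = 5
n0 = rng.normal(0.0, 1.0, trials)   # N(t) and N(t + tau) drawn independently,
n1 = rng.normal(0.0, 1.0, trials)   # so R_N(tau) = 0 for tau != 0

y0, y1 = b + n0, b + n1             # Y(t) and Y(t + tau) across the ensemble
ry = np.mean(y0 * y1)               # ensemble estimate of R_Y(t, tau)
print(abs(ry - 5.0) < 0.08)         # matches R_X(tau) + R_N(tau) = 5 + 0
```

Here E[B²] = 2² + 1 = 5, so the estimate should land near 5 because the noise contributes nothing at a nonzero lag.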
Example 7.2 X(t) is a WSS stochastic process with autocorrelation function R_X(τ). Y(t) is identical to X(t), except for the reversed time scale, i.e., Y(t) = X(−t).
(a) Express the autocorrelation function of Y(t) in terms of R_X(τ). Is Y(t) WSS?
(b) Express the cross-correlation of X(t) and Y(t) in terms of R_X(τ). Are X(t) and Y(t) jointly WSS?
(a) The autocorrelation function of Y(t) is

R_Y(t, τ) = E[Y(t)Y(t + τ)] = E[X(−t)X(−(t + τ))] = R_X(−t, −τ) = R_X(−τ) = R_X(τ).

Also,

E[Y(t)] = E[X(−t)] = μ_Y = μ_X.

This implies that Y(t) is WSS.
(b) Solution hint: use a method similar to the proof of Theorem 7.5.
Example 7.3 At the receiver of an AM radio, the received signal is composed of a carrier signal at the carrier frequency f_c (set to a fixed value) with random phase Θ that is a sample value of a uniform (0, 2π) random variable. The received signal can be modeled as

X(t) = A cos(2πf_c t + Θ).

Determine the expected value and the autocorrelation of the random process X(t). Is X(t) WSS?
From the definition of the PDF of a uniform rv, we determine the PDF of the phase to be

f_Θ(θ) = 1/(2π) for 0 ≤ θ ≤ 2π, and 0 otherwise.

Since the integral of the cosine over one full period is zero,

E[X(t)] = E[A cos(2πf_c t + Θ)] = 0.
Using the identity cos a cos b = [cos(a − b) + cos(a + b)]/2, with a = 2πf_c t + Θ and b = 2πf_c(t + τ) + Θ, we have

cos(a − b) = cos(−2πf_c τ) = cos(2πf_c τ),
cos(a + b) = cos[2πf_c(2t + τ) + 2Θ],

so that

R_X(t, τ) = (A²/2) E[cos(2πf_c τ) + cos(2πf_c(2t + τ) + 2Θ)]
= (A²/2) {E[cos(2πf_c τ)] + E[cos(2πf_c(2t + τ) + 2Θ)]}.

Note that letting α = 2πf_c(2t + τ) and k = 2, the second term becomes

E[cos(2πf_c(2t + τ) + 2Θ)] = E[cos(α + kΘ)] = 0.

Hence

R_X(t, τ) = (A²/2) cos(2πf_c τ) = R_X(τ).
Note that X(t) is often used to represent a current or voltage as a function of time. We have shown here that X(t) is a WSS random process.
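The result of Example 7.3 can be checked with a Monte Carlo ensemble over the random phase. A minimal sketch, assuming NumPy; the amplitude, carrier frequency, and the particular t and τ are illustrative values:

```python
import numpy as np

# Sketch: Monte Carlo check of Example 7.3. X(t) = A*cos(2*pi*fc*t + Theta),
# Theta ~ uniform(0, 2*pi), should give E[X(t)] = 0 and
# R_X(tau) = (A**2 / 2) * cos(2*pi*fc*tau), independent of t.
rng = np.random.default_rng(4)
A, fc, t, tau = 2.0, 5.0, 0.13, 0.04       # t and tau chosen arbitrarily
theta = rng.uniform(0.0, 2.0 * np.pi, 500000)

x_t = A * np.cos(2 * np.pi * fc * t + theta)
x_tau = A * np.cos(2 * np.pi * fc * (t + tau) + theta)

mean_est = np.mean(x_t)
r_est = np.mean(x_t * x_tau)
r_theory = (A**2 / 2) * np.cos(2 * np.pi * fc * tau)
print(abs(mean_est) < 0.01, abs(r_est - r_theory) < 0.01)
```

Repeating the check at a different t leaves the estimate essentially unchanged, consistent with R_X depending only on τ.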
Ergodicity
Previously, we briefly stated the ff.:
Ergodic Process A WSS process X(t) is ergodic when the time average of a sample function converges to the ensemble average, i.e., lim_{T→∞} X̄(T) = μ_X.
When the time average of the sample function of a WSS process is equal to the ensemble average, the process is ergodic. What is the significance of this statement?
In practice, the statistical description of a random process is often not available. What is done instead is to obtain time-based measurements and use the corresponding time averages. Fortunately, due to the stationary properties of the random process X(t), these time averages can substitute for the ensemble averages. This can be done because of the concept of ergodicity. That is, if X(t) can be modeled as ergodic, then the time averages can be used as substitutes for the ensemble averages.
More accurately, the above limit, in which the time average equals the ensemble average as T → ∞, describes what is termed ergodic in the mean.
Other types of ergodicity are ergodic in power, in correlation, or in the probability distribution.
Ergodic in the Mean A WSS random process X(t) is ergodic in the mean if the time average converges to the ensemble average E[X(t)] = μ_X, that is,

M(T) = (1/2T) ∫_{−T}^{T} X(t) dt → μ_X as T → ∞.

In the above equation, M(T) is the time-average random variable. Its mean and variance can be computed using the theory of mean-square integrals, obtaining the ff.:
E[M(T)] = (1/2T) ∫_{−T}^{T} E[X(t_1)] dt_1 = μ_X,

and

σ_M² = [1/(2T)²] ∫_{−T}^{T} ∫_{−T}^{T} C_X(t_1 − t_2) dt_1 dt_2.
Note that the basic idea here is that for an ergodic random process, we can use the time average of this process (which is also WSS) since this is equal to the ensemble average. For example, for an electrical signal X(t) modeled as a sample function of an ergodic process, μ_X, E[X²(t)], and other ensemble averages can be measured and observed using ordinary instruments.
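The ergodic-in-the-mean idea can be sketched with a single long realization. Assuming NumPy; the AR(1) model with mean 3 is an illustrative ergodic process, not one from the slides:

```python
import numpy as np

# Sketch: for a process that is ergodic in the mean, the time average of one
# long sample function approaches the ensemble mean mu_X.
rng = np.random.default_rng(5)
n, mu = 200000, 3.0
x = np.empty(n)
x[0] = mu
for i in range(1, n):
    x[i] = mu + 0.9 * (x[i - 1] - mu) + rng.normal()  # illustrative AR(1)

m = x.mean()               # time average M(T) from a single realization
print(abs(m - mu) < 0.1)   # close to the ensemble mean mu_X = 3
```

No ensemble of realizations is needed here: one sufficiently long sample path already pins down μ_X, which is exactly the practical payoff of ergodicity.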
Average Power The average power of a WSS process X(t) is R_X(0) = E[X²(t)].
Average Power The average power of a WSS sequence X_n is R_X[0] = E[X_n²].
The instantaneous power across a resistor R is obtained as either v²(t)/R or i²(t)R. Here it is implicitly assumed that R = 1 Ω. When x(t) is used to represent the voltage or current, the instantaneous power is then expressed as x²(t). The random variable X²(t) is then referred to as the instantaneous power of the process X(t).
If X(t) is used to model a voltage, the time average of the sample function x(t) over an interval 2T is given by:

X̄(T) = (1/2T) ∫_{−T}^{T} x(t) dt.

This is the DC voltage of x(t), which can be measured by a voltmeter.
The time average of the power of a sample function is then

X̄²(T) = (1/2T) ∫_{−T}^{T} x²(t) dt.
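Both time averages above can be sketched numerically for one sample function. Assuming NumPy; the sinusoid parameters and the fixed sample phase are illustrative, and the integrals over (−T, T) are approximated by sample means on a dense uniform grid:

```python
import numpy as np

# Sketch: time averages of x(t) = A*cos(2*pi*fc*t + theta) for one fixed
# (sample) phase theta, approximating (1/2T) * integral over (-T, T).
A, fc, theta, T = 2.0, 5.0, 0.7, 100.0
t = np.linspace(-T, T, 200001)
x = A * np.cos(2 * np.pi * fc * t + theta)

dc = x.mean()            # ~ DC value of x(t): 0 for a pure sinusoid
power = (x**2).mean()    # ~ time-average power: A**2 / 2 = R_X(0)
print(abs(dc) < 1e-3, abs(power - A**2 / 2) < 1e-3)
```

For this sinusoid the time-average power of a single sample function equals the ensemble average power A²/2, consistent with the ergodicity discussion above.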
Gaussian Processes
As discussed in the topic on the Central Limit Theorem, there is a prevalence of natural phenomena that can be modeled sufficiently well by Gaussian random variables. This also holds true for stochastic processes, i.e., many natural random processes can be modeled as Gaussian random processes.
Gaussian Process X(t) is a Gaussian stochastic process if and only if X = [X(t_1) ⋯ X(t_k)] is a Gaussian random vector for any integer k > 0 and any set of time instants t_1, t_2, …, t_k.
Gaussian Sequence X_n is a Gaussian random sequence if and only if X = [X_{n_1} ⋯ X_{n_k}] is a Gaussian random vector for any integer k > 0 and any set of time instants n_1, n_2, …, n_k.
Gaussian Random Vector X is a Gaussian (μ_X, C_X) random vector with expected value μ_X and covariance C_X if and only if

f_X(x) = 1/[(2π)^{n/2} det(C_X)^{1/2}] · exp{−(1/2)(x − μ_X)ᵀ C_X⁻¹ (x − μ_X)}.
The white Gaussian noise process is a mathematical model and cannot be reproduced by any physical electrical signal:

E[W²(t)] = R_W(0) = ∞.

That is, white noise has infinite power, a physical impossibility. The model is useful in the sense that any Gaussian noise signal observed in practice can be interpreted as a filtered white Gaussian noise signal with finite power.
Example 7.4: X(t) is a stationary Gaussian random process with μ_X(t) = 0 and autocorrelation function R_X(τ) = 2^{−|τ|}. What is the joint PDF of X(t) and X(t + 1)?
From the problem statement we obtain the ff.:

E[X(t)] = E[X(t + 1)] = 0;
R_X(t, 1) = R_X(1) = E[X(t)X(t + 1)] = 2^{−1} = 1/2;
Var[X(t)] = Var[X(t + 1)] = R_X(0) = 2^0 = 1.
The joint PDF of X(t) and X(t + 1) is the Gaussian vector PDF

f_{X(t),X(t+1)}(x_0, x_1) = 1/[(2π)^{n/2} det(C_X)^{1/2}] · exp{−(1/2) xᵀ C_X⁻¹ x}
= [1/(√3 π)] exp{−(2/3)(x_0² − x_0 x_1 + x_1²)},
where:

C_X = [ 1    1/2
        1/2  1  ],

C_X⁻¹ = (4/3) [ 1     −1/2
                −1/2   1   ],

and

xᵀ C_X⁻¹ x = [x_0  x_1] C_X⁻¹ [x_0  x_1]ᵀ = (4/3)(x_0² − x_0 x_1 + x_1²).
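The matrix computations of Example 7.4 can be verified numerically. A minimal sketch, assuming NumPy; the evaluation point (0.3, −0.2) is an arbitrary illustrative choice:

```python
import numpy as np

# Sketch: verify the covariance matrix algebra of Example 7.4.
C = np.array([[1.0, 0.5],
              [0.5, 1.0]])          # covariance of (X(t), X(t+1))
C_inv = np.linalg.inv(C)
det_C = np.linalg.det(C)

# Expected: C_inv = (4/3) * [[1, -1/2], [-1/2, 1]] and det(C) = 3/4.
ok_inv = np.allclose(C_inv, (4.0 / 3.0) * np.array([[1.0, -0.5], [-0.5, 1.0]]))
ok_det = abs(det_C - 0.75) < 1e-12

# The two forms of the joint PDF agree at an arbitrary point:
x = np.array([0.3, -0.2])
f1 = np.exp(-0.5 * x @ C_inv @ x) / (2 * np.pi * np.sqrt(det_C))
f2 = np.exp(-(2.0 / 3.0) * (x[0]**2 - x[0] * x[1] + x[1]**2)) / (np.sqrt(3) * np.pi)
print(ok_inv, ok_det, abs(f1 - f2) < 1e-12)
```

The agreement of f1 and f2 confirms that the normalizing constant (2π)^{n/2} det(C_X)^{1/2} reduces to √3 π here, since n = 2 and det(C_X) = 3/4.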