
Brownian Motion Process

The Poisson process is an example of a continuous-time, discrete-value stochastic process.


The Brownian motion process is a continuous-time, continuous-value stochastic process.
It describes the movement caused by random collisions of small particles with water molecules in
thermal motion. The following definition describes a Brownian motion along the x axis.
Brownian Motion Process A Brownian motion process W(t) has the property that W(0) = 0 and,
for $\tau > 0$, $W(t + \tau) - W(t)$ is a Gaussian$(0, \sqrt{\alpha\tau})$ random variable that is independent of $W(t')$ for all
$t' \le t$.
For a Brownian motion process, the PDF of a sample vector $\mathbf{W} = [W(t_1), \ldots, W(t_k)]$ is given
in the following theorem.
Theorem 6.7. For the Brownian motion process W(t), the joint PDF of $\mathbf{W} = [W(t_1), \ldots, W(t_k)]$ is

$$ f_{\mathbf{W}}(\mathbf{w}) = \prod_{n=1}^{k} \frac{1}{\sqrt{2\pi\alpha(t_n - t_{n-1})}}\; e^{-(w_n - w_{n-1})^2 / [2\alpha(t_n - t_{n-1})]}, $$

with the convention $t_0 = 0$ and $w_0 = 0$.

Theorem 6.8. The variance of a Brownian motion process W(t) is $\mathrm{Var}[W(t)] = \alpha t$.
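As a quick illustration, here is a minimal simulation sketch (assuming NumPy; the value of $\alpha$ and all variable names are our own choices, not from the source) that builds sample paths from independent Gaussian increments and checks Theorem 6.8 empirically:

```python
import numpy as np

rng = np.random.default_rng(0)

alpha = 2.0       # Brownian motion parameter (arbitrary choice)
dt = 0.01         # time step between samples
n_steps = 1000    # samples per path, so the final time is t = 10
n_paths = 10_000  # independent sample paths

# Each increment W(t + dt) - W(t) is Gaussian(0, sqrt(alpha * dt)),
# independent of the past; cumulative sums give W(t) with W(0) = 0.
increments = rng.normal(0.0, np.sqrt(alpha * dt), size=(n_paths, n_steps))
w = np.cumsum(increments, axis=1)

t_final = dt * n_steps
print(np.var(w[:, -1]))   # empirical Var[W(t)] at t = 10
print(alpha * t_final)    # Theorem 6.8: alpha * t = 20
```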


Modeling of Teletraffic Systems


Definitions:
(a) Call: a demand for a connection in teletraffic (WAN, wide area network) systems.
(b) Holding time: the duration of a call, also called service time.
(c) Traffic load / traffic intensity: the total holding time per unit time, measured in erlangs.
Example A.1. There are 3 calls per hour, with holding times of 5, 10, and 15 minutes, respectively. The
traffic load is then

$$ a = \frac{(5 + 10 + 15)\ \text{min}}{60\ \text{min}} = 0.5\ \text{erlang}. $$
The traffic load has the following characteristics:
(1) Let c be the number of calls originating per unit time, and h the mean holding time. Then the traffic
load or traffic intensity a is given by a = ch erlangs (see the sketch after this list).
(2) The traffic load is equal to the number of calls originating per unit time multiplied by the mean holding time.
(3) The traffic load carried by a single trunk is equivalent to the probability (fraction of time) that
the trunk is occupied (busy) with traffic.
(4) The traffic load carried by a group of trunks is equivalent to the mean (expected) number of
busy trunks in the group.
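A small sketch of characteristic (1), using the numbers of Example A.1 (the function name and unit conventions are ours, not from the source):

```python
def traffic_load(calls_per_hour: float, mean_holding_min: float) -> float:
    """Traffic load a = c * h in erlangs (c in calls/hour, h in hours)."""
    return calls_per_hour * (mean_holding_min / 60.0)

# Example A.1: c = 3 calls/hour, mean holding time h = (5+10+15)/3 = 10 min
print(traffic_load(3, 10))  # 0.5 erlang
```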

The Call Originating Process (Use of the Poisson Process to Model Call Arrivals)
Let's consider random call origination, where $\Delta t \to 0$ (Figure 1):
(1) The probability that a call originates in the time interval $(t, t + \Delta t]$ is $\lambda \Delta t$,
independent of t, where $\lambda$ is a constant.
(2) The probability that two or more calls originate in $(t, t + \Delta t]$ approaches zero as $\Delta t \to 0$.
(3) Calls originate independently of other calls.

Figure 1. Model for call origination (arrivals).


Our objective is to determine $p_k(t)$, the probability that k calls originate in the time interval (0, t].
As shown in Figure 1, the interval (0, t] is divided into a sufficiently large number n of subsections,
so that $\Delta t = t/n$.
The probability that exactly k calls originate in k particular subsections out of the n subsections
follows by noting that the subsections are i.i.d. Bernoulli trials. That is, the probability of a call
originating in each of the k particular subsections, and in none of the others, is $(\lambda \Delta t)^k (1 - \lambda \Delta t)^{n-k}$.

There are $\binom{n}{k}$ ways for the k originated calls to fall among the n subsections (binomial), and as $\Delta t \to 0$ (i.e., $n \to \infty$) we have

$$ p_k(t) = \lim_{n \to \infty} \binom{n}{k} \left(\frac{\lambda t}{n}\right)^{k} \left(1 - \frac{\lambda t}{n}\right)^{n-k} = \frac{(\lambda t)^k}{k!}\, e^{-\lambda t}, $$

which is the Poisson distribution with mean $\lambda t$, where $\lambda$ is called the arrival rate or origination rate.
$\lambda$ is also interpreted as the mean number of arrivals per unit time, which is equal to c (in a = ch).
The probability that no calls originate in (0, t] is given by

$$ p_0(t) = \left.\frac{(\lambda t)^k}{k!}\, e^{-\lambda t}\,\right|_{k=0} = e^{-\lambda t}. $$

Take note that $p_0(t)$ is also $P(\text{interarrival time } X_n > t)$, so that $P(X_n \le t) = 1 - P(X_n > t)$. Hence,
the probability of a call arrival within time t is

$$ A(t) = 1 - e^{-\lambda t}. $$

Therefore, a random call origination process has an exponential interarrival time distribution.
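A minimal sketch (assuming NumPy; $\lambda$ and t are illustrative values of our own) that mimics the subsection argument above with independent Bernoulli trials and checks the empirical probability of no arrivals against $p_0(t) = e^{-\lambda t}$:

```python
import numpy as np

rng = np.random.default_rng(1)

lam, t = 2.0, 1.0  # arrival rate and observation window (assumed values)
n = 1_000          # subsections, so dt = t/n is small
n_trials = 10_000

# One Bernoulli trial per subsection: a call originates with prob lam * dt,
# independently of the other subsections (assumptions (1)-(3) above).
dt = t / n
counts = (rng.random((n_trials, n)) < lam * dt).sum(axis=1)

# Empirical P(no call in (0, t]) versus the Poisson limit p0(t)
print(np.mean(counts == 0), np.exp(-lam * t))
```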
Service Time Distribution
Assume a random call termination rate $\mu$, also called the mean service rate. Since calls are
randomly terminated with random call holding times, the situation is very similar to the call
origination process. Applying the procedure above, we arrive at the result that the service time
has an exponential distribution. That is,

$$ H(t) = 1 - e^{-\mu t}. $$

Review
Covariance The covariance of two random variables X and Y is

$$ \mathrm{Cov}[X, Y] = \sigma_{XY} = E[(X - \mu_X)(Y - \mu_Y)]. $$

Correlation The correlation of two random variables X and Y is

$$ r_{XY} = E[XY]. $$

Theorem:

(a) $\mathrm{Cov}[X, Y] = r_{XY} - \mu_X \mu_Y$

(b) $\mathrm{Var}[X + Y] = \mathrm{Var}[X] + \mathrm{Var}[Y] + 2\,\mathrm{Cov}[X, Y]$

(c) If $X = Y$, then $\mathrm{Cov}[X, Y] = \mathrm{Var}[X] = \mathrm{Var}[Y]$ and $r_{XY} = E[X^2] = E[Y^2]$.

Theorem: For independent random variables X and Y,

(a) $E[g(X)h(Y)] = E[g(X)]\, E[h(Y)]$,
(b) $r_{XY} = E[XY] = E[X]\, E[Y]$,
(c) $\mathrm{Cov}[X, Y] = \sigma_{XY} = 0$,
(d) $\mathrm{Var}[X + Y] = \mathrm{Var}[X] + \mathrm{Var}[Y]$.
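A quick numerical check of part (a) of the first theorem above (a sketch assuming NumPy; the particular distributions are arbitrary choices of ours):

```python
import numpy as np

rng = np.random.default_rng(2)

n = 500_000
x = rng.normal(1.0, 2.0, size=n)
y = 3.0 * x + rng.normal(0.0, 1.0, size=n)  # deliberately correlated with x

r_xy = np.mean(x * y)                 # correlation r_XY = E[XY]
cov = r_xy - np.mean(x) * np.mean(y)  # theorem (a): Cov = r_XY - mu_X mu_Y
print(cov, np.cov(x, y, bias=True)[0, 1])  # the two estimates agree
```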


Expected Value and Correlation


In a previous discussion, we said that for a stochastic process X(t), the value $X(t_1)$ of a sample
function at time instant $t_1$ is a random variable. Hence, we can compute the PDF $f_{X(t_1)}(x)$ and the
expected value $E[X(t_1)]$.
E[X(t)] is a number for each value of t, so E[X(t)] is a deterministic function of t.
Expected Value of a Process The expected value of a stochastic process X(t) is the
deterministic function

$$ \mu_X(t) = E[X(t)]. $$

Autocovariance The autocovariance of the stochastic process X(t) is

$$ C_X(t, \tau) = \mathrm{Cov}[X(t), X(t + \tau)]. $$

Autocovariance The autocovariance of the random sequence $X_n$ is

$$ C_X[m, k] = \mathrm{Cov}[X_m, X_{m+k}]. $$
Recall: Cov[X, Y] indicates how much information the random variable X carries about the random variable Y.
The higher the covariance, the more accurately X predicts Y.
Consider a random process X(t). At time instants $t_1$ and $t_2 = t_1 + \tau$, the covariance
is inversely related to the rate of change of the process during the $\tau$ seconds.


That is, a small covariance value (near zero) indicates rapid change during the $\tau$ seconds.
Autocorrelation Function The autocorrelation function of the stochastic process X(t) is

$$ R_X(t, \tau) = E[X(t)\, X(t + \tau)]. $$

Autocorrelation Function The autocorrelation function of the random sequence $X_n$ is

$$ R_X[m, k] = E[X_m X_{m+k}]. $$
As their definitions show, autocovariance and autocorrelation are closely related, and each is
useful in certain situations.
Autocovariance is usually used in predicting future values of a random process.
Autocorrelation is usually used to describe the power of a random signal.
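As an illustration, the following sketch (assuming NumPy; the AR(1) model and its parameter are our own example, not from the source) estimates $R_X[k]$ of a stationary sequence by a time average and compares it with the known closed form:

```python
import numpy as np

rng = np.random.default_rng(3)

# A stationary AR(1) sequence: X_n = phi * X_{n-1} + Z_n, Z_n iid Gaussian
n, phi = 100_000, 0.8
z = rng.normal(0.0, 1.0, size=n)
x = np.zeros(n)
for i in range(1, n):
    x[i] = phi * x[i - 1] + z[i]

def autocorr(seq, k):
    """Estimate R_X[k] = E[X_m X_{m+k}] by a time average."""
    return np.mean(seq[: len(seq) - k] * seq[k:]) if k > 0 else np.mean(seq * seq)

# Closed form for this AR(1) model: R_X[k] = phi**k / (1 - phi**2)
for k in (0, 1, 5):
    print(k, autocorr(x, k), phi**k / (1 - phi**2))
```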


Theorem 7.1 The autocorrelation and autocovariance functions of a stochastic process X(t)
satisfy

$$ C_X(t, \tau) = R_X(t, \tau) - \mu_X(t)\,\mu_X(t + \tau). $$

The autocorrelation and autocovariance functions of a random sequence $X_n$ satisfy

$$ C_X[n, k] = R_X[n, k] - \mu_X[n]\,\mu_X[n + k]. $$


Stationary Process A stochastic process X(t) is stationary if and only if for all sets of time instants $t_1, \ldots, t_m$
and any time difference $\tau$,

$$ f_{X(t_1), \ldots, X(t_m)}(x_1, \ldots, x_m) = f_{X(t_1 + \tau), \ldots, X(t_m + \tau)}(x_1, \ldots, x_m). $$

Stationary Sequence A random sequence $X_n$ is stationary if and only if for all sets of integer time instants
$n_1, \ldots, n_m$ and any integer time difference k,

$$ f_{X_{n_1}, \ldots, X_{n_m}}(x_1, \ldots, x_m) = f_{X_{n_1 + k}, \ldots, X_{n_m + k}}(x_1, \ldots, x_m). $$

Note: For a stationary process, for any time instants $t_1$ and $t_1 + \tau$ (necessary but not sufficient),

$$ f_{X(t_1)}(x) = f_{X(t_1 + \tau)}(x) = f_X(x). $$
Thus for a stationary process, the same random variable is observed for all time instants. The key
idea in the stationary property is that the statistical properties of the random process do not change
with time.
Theorem 7.2 Let X(t) be a stationary random process. For constants a > 0 and b,
Y(t) = aX(t) + b is also a stationary process.

Theorem 7.3 For a stationary process X(t), the expected value, the autocorrelation, and the
autocovariance have the following properties for all t:

(a) $E[X(t)] = \mu_X$,

(b) $R_X(t, \tau) = R_X(0, \tau) = R_X(\tau)$,

(c) $C_X(t, \tau) = R_X(\tau) - \mu_X^2 = C_X(\tau)$.

For a stationary random sequence $X_n$, the expected value, the autocorrelation, and the
autocovariance satisfy for all n:

(a) $E[X_n] = \mu_X$,

(b) $R_X[n, k] = R_X[0, k] = R_X[k]$,

(c) $C_X[n, k] = R_X[k] - \mu_X^2 = C_X[k]$.

This shows the time-invariance properties of stationary processes. That is, the expected value,
the autocovariance, and the autocorrelation are independent of t and depend only on
the time-difference variable $\tau$.
Wide Sense Stationary Stochastic Processes (WSS)
WSS Process X(t) is a wide sense stationary stochastic process if and only if for all t,

$$ E[X(t)] = \mu_X \quad \text{and} \quad R_X(t, \tau) = R_X(0, \tau) = R_X(\tau). $$

WSS Sequence $X_n$ is a wide sense stationary random sequence if and only if for all n,

$$ E[X_n] = \mu_X \quad \text{and} \quad R_X[n, k] = R_X[0, k] = R_X[k]. $$

Stationary processes (also called Strict Sense Stationary (SSS)) are also WSS but NOT the
other way around.

Theorem 7.4 For a WSS process X(t), the autocorrelation function $R_X(\tau)$ has the following properties:

$$ R_X(0) \ge 0, \qquad R_X(\tau) = R_X(-\tau), \qquad R_X(0) \ge |R_X(\tau)|. $$

Ergodic Process A WSS process X(t) is ergodic when

$$ \lim_{T \to \infty} \overline{X}(T) = \mu_X, $$

where $\overline{X}(T)$ denotes the time average of a sample function over $[-T, T]$.
When the time average of the sample function of a WSS process is equal to the ensemble average,
then the process is ergodic.
This simply means that if we take the time average of any sample function of a WSS process, it
will be the same as the ensemble average (recall the earlier definitions of ensemble averages).
Cross-Correlation Function The cross-correlation of continuous-time random processes X(t) and
Y(t) is

$$ R_{XY}(t, \tau) = E[X(t)\, Y(t + \tau)]. $$

Cross-Correlation Function The cross-correlation of random sequences $X_n$ and $Y_n$ is

$$ R_{XY}[m, k] = E[X_m Y_{m+k}]. $$

Jointly Wide Sense Stationary Continuous-time random processes X(t) and Y(t) are jointly WSS if
both are WSS and the cross-correlation depends only on the time difference between the two random variables:

$$ R_{XY}(t, \tau) = R_{XY}(\tau). $$


Jointly Wide Sense Stationary Random sequences $X_n$ and $Y_n$ are jointly WSS if both are
WSS and the cross-correlation depends only on the time difference between the two random variables:

$$ R_{XY}[m, k] = R_{XY}[k]. $$
Theorem 7.5 If X(t) and Y(t) are jointly WSS continuous-time processes, then

$$ R_{XY}(\tau) = R_{YX}(-\tau). $$

If $X_n$ and $Y_n$ are jointly WSS random sequences, then

$$ R_{XY}[k] = R_{YX}[-k]. $$

Proof: From the definition, $R_{XY}(t, \tau) = E[X(t)\, Y(t + \tau)]$. Making the substitution $u = t + \tau$,

$$ R_{XY}(t, \tau) = E[X(u - \tau)\, Y(u)] = E[Y(u)\, X(u - \tau)] = R_{YX}(u, -\tau) = R_{YX}(-\tau). $$

From the definition of jointly WSS random processes,

$$ R_{XY}(t, \tau) = R_{XY}(\tau) = R_{YX}(-\tau). $$
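A small numerical sketch of Theorem 7.5 for sequences (assuming NumPy; the delayed-copy construction is an arbitrary example of jointly WSS sequences, chosen by us):

```python
import numpy as np

rng = np.random.default_rng(4)

n, d = 200_000, 2
x = rng.normal(0.0, 1.0, size=n)
y = np.roll(x, d)  # Y_n = X_{n-d}: a delayed copy, jointly WSS with X_n

def cross_corr(a, b, k):
    """Estimate R_ab[k] = E[a_m b_{m+k}] by a time average."""
    if k > 0:
        return np.mean(a[: n - k] * b[k:])
    if k < 0:
        return np.mean(a[-k:] * b[: n + k])
    return np.mean(a * b)

# Theorem 7.5: R_XY[k] = R_YX[-k] for every lag k
for k in (-d, 0, d):
    print(k, cross_corr(x, y, k), cross_corr(y, x, -k))
```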


Example 7.1: Suppose we are interested in X(t) but we can only observe

$$ Y(t) = X(t) + N(t), $$

where N(t) is a noise process that interferes with our observation of X(t). Assume that X(t) and N(t)
are independent WSS processes with $E[X(t)] = \mu_X$ and $E[N(t)] = \mu_N = 0$.
(a) Is Y(t) a WSS process? (b) Are X(t) and Y(t) jointly WSS? (c) Are Y(t) and N(t) jointly WSS?
(a) For Y(t) we obtain

$$ E[Y(t)] = E[X(t)] + E[N(t)] = \mu_X + 0 = \mu_X. $$

The autocorrelation is

$$ R_Y(t, \tau) = E[Y(t)\, Y(t + \tau)] = E[(X(t) + N(t))(X(t + \tau) + N(t + \tau))] $$
$$ = E[X(t)X(t + \tau)] + E[X(t)N(t + \tau)] + E[N(t)X(t + \tau)] + E[N(t)N(t + \tau)] $$
$$ = R_X(\tau) + R_{XN}(t, \tau) + R_{NX}(t, \tau) + R_N(\tau). $$

Note that X(t) and N(t) are independent, so that

$$ R_{NX}(t, \tau) = E[N(t)]\, E[X(t + \tau)] = 0, \qquad R_{XN}(t, \tau) = E[X(t)]\, E[N(t + \tau)] = 0, $$

since $\mathrm{Cov}[X, N] = 0 = R_{XN} - \mu_X \mu_N = R_{XN} - 0$, so $R_{XN} = 0$.

Hence we have

$$ R_Y(t, \tau) = R_X(\tau) + R_N(\tau). $$

Since $E[Y(t)] = \mu_X$ is constant and $R_Y(t, \tau)$ depends only on $\tau$, Y(t) is WSS.

(b) To determine whether Y(t) and X(t) are jointly WSS, we calculate the cross-correlation:

$$ R_{YX}(t, \tau) = E[Y(t)\, X(t + \tau)] = E[(X(t) + N(t))\, X(t + \tau)] $$
$$ = E[X(t)X(t + \tau)] + E[N(t)X(t + \tau)] = R_X(t, \tau) + R_{NX}(t, \tau) = R_X(\tau) + 0 = R_X(\tau). $$

Y(t) and X(t) are jointly WSS.


(c) Similarly, for Y(t) and N(t) we calculate the cross-correlation:

$$ R_{YN}(t, \tau) = E[Y(t)\, N(t + \tau)] = E[(X(t) + N(t))\, N(t + \tau)] $$
$$ = E[X(t)N(t + \tau)] + E[N(t)N(t + \tau)] = R_{XN}(t, \tau) + R_N(\tau) = 0 + R_N(\tau) = R_N(\tau). $$

Y(t) and N(t) are jointly WSS.
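A discrete-time numerical sketch of this example (assuming NumPy; the distributions of the signal and noise are arbitrary choices of ours), confirming $R_Y = R_X + R_N$ and $R_{YX} = R_X$:

```python
import numpy as np

rng = np.random.default_rng(5)

# Discrete-time stand-in for Example 7.1: Y_n = X_n + N_n, independent parts
n = 500_000
x = rng.normal(1.0, 1.0, size=n)      # WSS signal with mu_X = 1 (assumed)
noise = rng.normal(0.0, 0.5, size=n)  # zero-mean noise, independent of x
y = x + noise

def r(a, b, k):
    """Estimate the (cross-)correlation at nonnegative lag k."""
    return np.mean(a[: n - k] * b[k:]) if k > 0 else np.mean(a * b)

for k in (0, 3):
    print(r(y, y, k), r(x, x, k) + r(noise, noise, k))  # R_Y = R_X + R_N
    print(r(y, x, k), r(x, x, k))                       # R_YX = R_X
```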


Example 7.2 X(t) is a WSS stochastic process with autocorrelation function $R_X(\tau)$. Y(t) is identical
to X(t), except for a reversed time scale, i.e., $Y(t) = X(-t)$.
(a) Express the autocorrelation function of Y(t) in terms of $R_X(\tau)$. Is Y(t) WSS?
(b) Express the cross-correlation of X(t) and Y(t) in terms of $R_X(\tau)$. Are X(t) and Y(t) jointly WSS?
(a) The autocorrelation function of Y(t) is

$$ R_Y(t, \tau) = E[Y(t)\, Y(t + \tau)] = E[X(-t)\, X(-(t + \tau))] = R_X(-t, -\tau) = R_X(-\tau) = R_X(\tau). $$

Also,

$$ E[Y(t)] = E[X(-t)] = \mu_Y = \mu_X. $$

This implies that Y(t) is WSS.
(b) Solution: left as an exercise. (Hint: use a method similar to the proof of Theorem 7.5.)


Example 7.3 At the receiver of an AM radio, the received signal is composed of a carrier signal at
the carrier frequency $f_c$ (a fixed value) with a random phase $\Theta$ that is a sample value of a
uniform $(0, 2\pi)$ random variable. The received signal can be modeled as

$$ X(t) = A \cos(2\pi f_c t + \Theta). $$

Determine the expected value and the autocorrelation of the random process X(t). Is X(t) WSS?
From the definition of the PDF of a uniform random variable, the PDF of the phase $\Theta$ is

$$ f_\Theta(\theta) = \begin{cases} 1/(2\pi) & 0 \le \theta \le 2\pi, \\ 0 & \text{otherwise.} \end{cases} $$

For any nonzero integer k and fixed angle $\alpha$,

$$ E[\cos(\alpha + k\Theta)] = \int_0^{2\pi} \cos(\alpha + k\theta)\, \frac{1}{2\pi}\, d\theta = \frac{1}{2\pi k}\left[\sin(\alpha + 2\pi k) - \sin\alpha\right] = 0, $$

since

$$ \sin(\alpha + 2\pi k) = \sin\alpha \cos 2\pi k + \cos\alpha \sin 2\pi k = \sin\alpha. $$

Without loss of generality, let $\alpha = 2\pi f_c t$ and $k = 1$; then

$$ E[X(t)] = E[A \cos(2\pi f_c t + \Theta)] = 0, $$

by using the previous result.

For the autocorrelation we have

$$ R_X(t, \tau) = E[X(t)\, X(t + \tau)] = E[A\cos(2\pi f_c t + \Theta) \cdot A\cos(2\pi f_c (t + \tau) + \Theta)]. $$

Using the identity $\cos A \cos B = [\cos(A - B) + \cos(A + B)]/2$, we have

$$ \cos[2\pi f_c t + \Theta - 2\pi f_c (t + \tau) - \Theta] = \cos(2\pi f_c \tau), $$
$$ \cos[2\pi f_c t + \Theta + 2\pi f_c (t + \tau) + \Theta] = \cos[2\pi f_c (2t + \tau) + 2\Theta], $$

so

$$ R_X(t, \tau) = \frac{A^2}{2}\, E\big[\cos(2\pi f_c \tau) + \cos(2\pi f_c (2t + \tau) + 2\Theta)\big] = \frac{A^2}{2}\Big[\cos(2\pi f_c \tau) + E[\cos(2\pi f_c (2t + \tau) + 2\Theta)]\Big]. $$

Note that letting $\alpha = 2\pi f_c (2t + \tau)$ and $k = 2$, the second term becomes

$$ E[\cos(2\pi f_c (2t + \tau) + 2\Theta)] = E[\cos(\alpha + k\Theta)] = 0. $$

Hence

$$ R_X(t, \tau) = \frac{A^2}{2}\cos(2\pi f_c \tau) = R_X(\tau). $$
Note that X(t) is often used to represent a current or voltage as a function of time. We have shown
here that X(t) is a WSS random process.
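A simulation sketch of this example (assuming NumPy; A, $f_c$, t, and $\tau$ are arbitrary values of ours) that averages over many sample values of the random phase $\Theta$:

```python
import numpy as np

rng = np.random.default_rng(6)

A, fc = 2.0, 5.0  # amplitude and carrier frequency (arbitrary choices)
n_paths = 200_000
theta = rng.uniform(0.0, 2 * np.pi, size=n_paths)  # one phase per path

t, tau = 0.37, 0.1  # an arbitrary time instant and lag
x_t = A * np.cos(2 * np.pi * fc * t + theta)
x_t_tau = A * np.cos(2 * np.pi * fc * (t + tau) + theta)

print(np.mean(x_t))                             # ensemble mean ~ 0
print(np.mean(x_t * x_t_tau))                   # empirical R_X(t, tau)
print(A**2 / 2 * np.cos(2 * np.pi * fc * tau))  # (A^2/2) cos(2 pi fc tau)
```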

Ergodicity
Previously, we briefly stated the following:
Ergodic Process A WSS process X(t) is ergodic when $\lim_{T \to \infty} \overline{X}(T) = \mu_X$.
When the time average of the sample function of a WSS process is equal to the ensemble average,
then the process is ergodic. What is the significance of this statement?
In practice, the statistical description of a random process is often not available. What is done instead is to
obtain time-based measurements and use the corresponding time averages. Fortunately, due to the
stationarity of the random process X(t), these time averages can substitute for the ensemble
averages. This is possible because of the concept of ergodicity: if X(t) can be modeled
as ergodic, then time averages can be used as substitutes for ensemble averages.
More precisely, the above limit, in which the time average equals the ensemble average as $T \to \infty$,
describes what is termed ergodic in the mean.
Other types of ergodicity are ergodicity in power, in correlation, and in the probability distribution.
Ergodic in the Mean A WSS random process X(t) is ergodic in the mean if the time average
converges to the ensemble average $E[X(t)] = \mu_X$, that is,

$$ M(T) = \frac{1}{2T}\int_{-T}^{T} X(t)\, dt \to \mu_X \quad \text{as } T \to \infty. $$

In the above equation, M(T) is the time-average random variable. Its mean and variance can be
computed using the theory of mean-square integrals, obtaining the following:

$$ E[M(T)] = E\left[\frac{1}{2T}\int_{-T}^{T} X(t)\, dt\right] = \mu_X \quad \text{and} \quad \sigma_M^2 = \frac{1}{(2T)^2}\int_{-T}^{T}\!\int_{-T}^{T} C_X(t_1 - t_2)\, dt_1\, dt_2. $$
Note that the basic idea here is that for an ergodic random process, we can use the time average
of a sample function, since this is equal to the ensemble average.

For example, for an electrical signal X(t) modeled as a sample function of an ergodic process, $\mu_X$,
$E[X^2(t)]$, and other ensemble averages can be measured and observed using ordinary instruments.
Average Power The average power of a WSS process X(t) is $R_X(0) = E[X^2(t)]$.
Average Power The average power of a WSS sequence $X_n$ is $R_X[0] = E[X_n^2]$.
The instantaneous power across a resistor R is obtained as $v^2(t)/R$ or $i^2(t)R$. Here it is
implicitly assumed that R = 1. When x(t) is used to represent the voltage or current, the
instantaneous power is then expressed as $x^2(t)$. The random variable $X^2(t)$ is then referred to as
the instantaneous power of the process X(t).
If X(t) is used to model a voltage, the time average of the sample function x(t) over an interval 2T
is given by

$$ \overline{X}(T) = \frac{1}{2T}\int_{-T}^{T} x(t)\, dt. $$

This is the DC voltage of x(t), which can be measured by a voltmeter.
The time average of the power of a sample function is then

$$ \overline{X^2}(T) = \frac{1}{2T}\int_{-T}^{T} x^2(t)\, dt. $$
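A discrete-time sketch of these time averages (assuming NumPy; the process is an arbitrary mean-ergodic example of our own): the sample-function averages converge to the corresponding ensemble averages.

```python
import numpy as np

rng = np.random.default_rng(7)

# One long sample function of a mean-ergodic discrete-time process
mu = 1.5                               # assumed ensemble mean
n = 1_000_000
x = mu + rng.normal(0.0, 2.0, size=n)  # iid fluctuation around mu

print(np.mean(x))     # time-average "DC value" -> mu_X = 1.5
print(np.mean(x**2))  # time-average power -> R_X[0] = mu**2 + 4 = 6.25
```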

Gaussian Processes
As discussed in the topic on the Central Limit Theorem, many natural phenomena can be modeled
adequately by Gaussian random variables. The same holds for stochastic
processes: many natural random processes can be modeled as Gaussian random processes.
Gaussian Process X(t) is a Gaussian stochastic process if and only if $\mathbf{X} = [X(t_1), \ldots, X(t_k)]$
is a Gaussian random vector for any integer k > 0 and any set of time instants $t_1, t_2, \ldots, t_k$.
Gaussian Sequence $X_n$ is a Gaussian random sequence if and only if $\mathbf{X} = [X_{n_1}, \ldots, X_{n_k}]$
is a Gaussian random vector for any integer k > 0 and any set of time instants $n_1, n_2, \ldots, n_k$.
Gaussian Random Vector $\mathbf{X}$ is a Gaussian $(\boldsymbol{\mu}_X, \mathbf{C}_X)$ random vector with expected value $\boldsymbol{\mu}_X$ and
covariance $\mathbf{C}_X$ if and only if

$$ f_{\mathbf{X}}(\mathbf{x}) = \frac{1}{(2\pi)^{n/2}\,[\det(\mathbf{C}_X)]^{1/2}} \exp\!\left(-\frac{1}{2}(\mathbf{x} - \boldsymbol{\mu}_X)^{T}\, \mathbf{C}_X^{-1}\, (\mathbf{x} - \boldsymbol{\mu}_X)\right), $$

where $\det(\mathbf{C}_X) > 0$.


Theorem 7.6
If X(t) is a WSS Gaussian process, then X(t) is a stationary (SSS) Gaussian process.
If Xn is a WSS Gaussian sequence, then Xn is a stationary (SSS) Gaussian sequence.
White Gaussian Noise W(t) is a white Gaussian noise process if and only if W(t) is a stationary Gaussian
stochastic process with properties $\mu_W = 0$ and $R_W(\tau) = \eta_0\,\delta(\tau)$.

The white Gaussian noise process is a mathematical model and cannot be reproduced by any
electrical signal:

$$ E[W^2(t)] = R_W(0) = \infty. $$

That is, white noise has infinite power, a physical impossibility. The model is useful in the sense
that any Gaussian noise signal observed in practice can be interpreted as a filtered white Gaussian
noise signal with finite power.
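A discrete-time sketch of this filtering interpretation (assuming NumPy; the moving-average filter is an arbitrary choice of ours): iid Gaussian samples play the role of white noise, and the filter output has finite average power.

```python
import numpy as np

rng = np.random.default_rng(8)

# Discrete-time white Gaussian noise (iid samples) through a short
# moving-average filter: the output is Gaussian noise with finite power.
n = 500_000
w = rng.normal(0.0, 1.0, size=n)  # "white" input: R_W[k] = delta[k]
h = np.ones(5) / 5.0              # 5-tap moving-average filter
y = np.convolve(w, h, mode="valid")

print(np.mean(y * y))  # output power -> sum(h**2) = 0.2, finite
print(np.sum(h**2))
```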
Example 7.4: X(t) is a stationary Gaussian random process with $\mu_X(t) = 0$ and autocorrelation
function $R_X(\tau) = 2^{-|\tau|}$. What is the joint PDF of X(t) and X(t+1)?
From the problem statement we obtain the following:

$$ E[X(t)] = E[X(t+1)] = 0, \qquad \mathrm{Var}[X(t)] = \mathrm{Var}[X(t+1)] = R_X(0) = 2^{0} = 1, $$
$$ E[X(t)X(t+1)] = R_X(1) = 2^{-1} = 1/2. $$

The joint PDF of X(t) and X(t+1) is the Gaussian vector PDF

$$ f_{X(t), X(t+1)}(x_0, x_1) = \frac{1}{(2\pi)^{n/2}\,[\det(\mathbf{C}_X)]^{1/2}} \exp\!\left(-\frac{1}{2}\mathbf{x}^{T}\mathbf{C}_X^{-1}\mathbf{x}\right) = \frac{1}{\sqrt{3}\,\pi} \exp\!\left(-\frac{2}{3}\left(x_0^2 - x_0 x_1 + x_1^2\right)\right), $$

where

$$ \mathbf{C}_X = \begin{bmatrix} 1 & 1/2 \\ 1/2 & 1 \end{bmatrix}, \qquad \mathbf{C}_X^{-1} = \frac{4}{3}\begin{bmatrix} 1 & -1/2 \\ -1/2 & 1 \end{bmatrix}, $$

$$ \mathbf{x}^{T}\mathbf{C}_X^{-1}\mathbf{x} = \begin{bmatrix} x_0 & x_1 \end{bmatrix} \frac{4}{3}\begin{bmatrix} 1 & -1/2 \\ -1/2 & 1 \end{bmatrix}\begin{bmatrix} x_0 \\ x_1 \end{bmatrix} = \frac{4}{3}\left(x_0^2 - x_0 x_1 + x_1^2\right). $$
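As a check, a sketch (assuming NumPy and SciPy are available) that evaluates the joint PDF both through a library routine and through the closed form derived above:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Covariance matrix of [X(t), X(t+1)] from Example 7.4
C = np.array([[1.0, 0.5],
              [0.5, 1.0]])
rv = multivariate_normal(mean=[0.0, 0.0], cov=C)

x0, x1 = 0.3, -0.4  # an arbitrary evaluation point
lhs = rv.pdf([x0, x1])
rhs = np.exp(-(2 / 3) * (x0**2 - x0 * x1 + x1**2)) / (np.sqrt(3) * np.pi)
print(lhs, rhs)     # the library PDF matches the closed form above
```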

