Chapter 4: Random Processes

Dr. Barış Atakan
Random Variables

• Random Variables: A random variable is a mapping from the sample space Ω to the set of real numbers.

• Random variables are denoted by capital letters X, Y, etc.; individual values of the random variable X are X(ω).

• A random variable is discrete if the range of its values is finite or countably infinite. This range is usually denoted by {xi}.

Random Variables

• The cumulative distribution function (CDF) of a random variable X is defined as

  FX(x) = P(X ≤ x)

or, equivalently, FX(x) = P({ω ∈ Ω : X(ω) ≤ x}).

• Properties of the CDF:

1. 0 ≤ FX(x) ≤ 1
2. FX(x) is nondecreasing
3. lim x→−∞ FX(x) = 0 and lim x→+∞ FX(x) = 1
4. FX(x) is continuous from the right, i.e., FX(a⁺) = FX(a), where a⁺ = lim ε→0, ε>0 (a + ε)
5. P(a < X ≤ b) = FX(b) − FX(a)
6. P(X > a) = 1 − FX(a)
7. P(X < b) = FX(b⁻), where b⁻ = lim ε→0, ε>0 (b − ε)
8. P(X = a) = FX(a) − FX(a⁻)
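As an illustrative sketch (not part of the slides), the properties above can be checked numerically on an empirical CDF built from samples of a fair die:

```python
# Empirical CDF of 10,000 fair-die rolls; checks properties 1-3 numerically.
# Illustrative sketch only; the sample size is an assumed example value.
import random

random.seed(0)
samples = [random.randint(1, 6) for _ in range(10_000)]

def F(x):
    """Empirical CDF: fraction of samples <= x, estimating P(X <= x)."""
    return sum(1 for s in samples if s <= x) / len(samples)

values = [F(x) for x in range(0, 8)]
assert all(0.0 <= v <= 1.0 for v in values)             # property 1
assert all(a <= b for a, b in zip(values, values[1:]))  # property 2 (nondecreasing)
```

Left of the support F is 0 and right of it F is 1, matching property 3.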
Random Variables

• For discrete random variables, FX(x) is a staircase function.

• A random variable is called continuous if FX(x) is a continuous function.
Random Variables
• The probability density function (PDF) of a random variable X is defined as the derivative of FX(x); i.e.,

  fX(x) = d FX(x) / dx

• In the case of discrete random variables, the PDF contains impulses. For discrete random variables, it is more common to define the probability mass function (PMF), defined as {pi} where pi = P(X = xi). Obviously, for all i we have pi ≥ 0 and ∑i pi = 1.

• Properties of the PDF:

1. fX(x) ≥ 0
2. ∫−∞..∞ fX(x) dx = 1
3. ∫a..b fX(x) dx = P(a < X ≤ b)
4. FX(x) = ∫−∞..x fX(u) du
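As a sketch (not from the slides), the PMF of a fair die and the staircase CDF built from it by summing the PMF:

```python
# PMF of a fair die; checks p_i >= 0 and sum_i p_i = 1, and builds the CDF
# as the running sum of the PMF (a staircase function). Illustrative sketch.
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}   # p_i = P(X = x_i)

def cdf(x):
    """F_X(x) = sum of p_i over all x_i <= x (property 4, discrete form)."""
    return sum((p for xi, p in pmf.items() if xi <= x), Fraction(0))

assert all(p >= 0 for p in pmf.values())   # p_i >= 0
assert sum(pmf.values()) == 1              # normalization
```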
Random Variables: Important Random Variables
• Bernoulli Random Variable: A discrete random variable taking the two values one and zero with probabilities p and 1 − p, respectively.
• A Bernoulli random variable is a good model for a binary data generator.
• When binary data is transmitted over a communication channel, some bits are received in error.
• A Bernoulli random variable can be employed to model the channel errors.
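A minimal sketch of this channel-error model, with an assumed example value p = 0.1:

```python
# Channel errors as i.i.d. Bernoulli(p) trials: 1 = bit error, 0 = no error.
# p = 0.1 and n = 100,000 are assumed example values.
import random

random.seed(1)
p = 0.1
n = 100_000
errors = [1 if random.random() < p else 0 for _ in range(n)]
error_rate = sum(errors) / n   # empirical error rate, close to p
```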

Random Variables: Important Random Variables
• Binomial Random Variable: A discrete random variable giving the number of 1's in a sequence of n independent Bernoulli trials. The PMF is

  P(X = k) = C(n, k) p^k (1 − p)^(n−k),  k = 0, 1, . . . , n,  where C(n, k) = n!/(k!(n − k)!)

• For example, the total number of bits received in error when a sequence of n bits is transmitted over a channel with bit-error probability p.
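A sketch of this PMF for an assumed example of n = 8 bits and bit-error probability p = 0.1:

```python
# Binomial PMF: probability of k bit errors in n independent Bernoulli(p)
# trials. n = 8 and p = 0.1 are assumed example values.
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 8, 0.1
probs = [binom_pmf(k, n, p) for k in range(n + 1)]   # must sum to 1
```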

Random Variables: Important Random Variables
• Uniform Random Variable: A continuous random variable taking values between a and b with equal probabilities over intervals of equal length. The density function is given by

  fX(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise

• For example, when the phase of a sinusoid is random, it is usually modeled as a uniform random variable between 0 and 2π.
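A sketch of drawing such a random phase (sample size is an assumed example value):

```python
# Random phase Theta ~ Uniform[0, 2*pi), as used for sinusoid phases.
# The sample mean should approach (0 + 2*pi)/2 = pi.
import math
import random

random.seed(2)
phases = [random.uniform(0.0, 2 * math.pi) for _ in range(100_000)]
mean_phase = sum(phases) / len(phases)
```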

Random Variables: Important Random Variables
• Gaussian or Normal Random Variable: A continuous random variable described by the density function

  fX(x) = (1/(√(2π) σ)) e^(−(x − m)²/(2σ²))

• This is the most important and most frequently encountered random variable in communications. The reason is that thermal noise, which is the major source of noise in communication systems, has a Gaussian distribution.

Random Variables: Important Random Variables
• The CDF of the Gaussian random variable with m = 0 and σ = 1 is denoted by Φ(x) and given by

  Φ(x) = ∫−∞..x (1/√(2π)) e^(−t²/2) dt

• A closely related function is Q(x) = 1 − Φ(x), giving P(X > x). This function is frequently used in communications.

Random Variables: Important Random Variables
• There exist certain bounds on the Q-function that are widely used to find bounds on the error probability of various communication systems:

  Q(x) = ∫x..∞ (1/√(2π)) e^(−t²/2) dt

  (1 − 1/x²) · e^(−x²/2)/(x√(2π)) ≤ Q(x) ≤ e^(−x²/2)/(x√(2π)),  for x ≥ 0

• A Gaussian random variable can be described in terms of its two parameters m and σ by N(m, σ²). For this random variable, a simple change of variable in the integral that computes P(X > x) results in P(X > x) = Q((x − m)/σ). This gives the so-called tail probability of a Gaussian random variable.
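A minimal numeric sketch of these relations, using the identity Q(x) = ½ erfc(x/√2); the values x = 2, m = 1, σ = 2 are assumed examples:

```python
# Q-function via the complementary error function, a check of the standard
# bounds at x = 2, and the tail probability P(X > a) = Q((a - m)/sigma).
import math

def Q(x):
    """Q(x) = P(X > x) for X ~ N(0, 1)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

x = 2.0
upper = math.exp(-x**2 / 2) / (x * math.sqrt(2 * math.pi))
lower = (1 - 1 / x**2) * upper
assert lower <= Q(x) <= upper      # bounds hold for x > 0

m, sigma = 1.0, 2.0                # assumed example parameters of N(m, sigma^2)
tail = Q((3.0 - m) / sigma)        # P(X > 3) for X ~ N(1, 4)
```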

Random Processes

• Random process: A collection of signals corresponding to the various outcomes of a random experiment.
• Example: Consider the random experiment of throwing a die. In this case, Ω = {1, 2, 3, 4, 5, 6}. For each ωi, let x(t, ωi) = ωi e^(−t) u−1(t). Then X(1) is a random variable taking the values e^(−1), 2e^(−1), . . . , 6e^(−1), each with probability 1/6.
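The die-throw example can be sketched directly (illustrative only):

```python
# Sample functions of the process: x(t, w_i) = w_i * e^(-t) * u(t).
# Sampling at t = 1 gives the random variable X(1).
import math

def x(t, omega):
    """One sample function of the process, for outcome omega."""
    return omega * math.exp(-t) if t >= 0 else 0.0

X1_values = [x(1.0, w) for w in range(1, 7)]   # each value has probability 1/6
```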

Random Processes

• Alternatively, we may view the random signal at t1, t2, . . . , or in general at all t ∈ R, as a collection of random variables {X(t1), X(t2), . . .}, or in general {X(t), t ∈ R}.
• From this viewpoint, a random process is represented as a collection of random variables indexed by some index set.

• Continuous-time random process: the index set is the set of real numbers.
• Discrete-time random process: the index set is the set of all integers.

Example: Let ωi denote the outcome of a random experiment consisting of independent drawings from a Gaussian random variable distributed according to N(0, 1). Let the discrete-time random process Xn, n ∈ {0, 1, . . .}, be defined by X0 = ω0 and Xn = Xn−1 + ωn for n ≥ 1.
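A sketch of one realization of this discrete-time process (a Gaussian random walk; N = 1000 steps is an assumed example value):

```python
# Discrete-time process X_n = X_{n-1} + w_n with i.i.d. N(0, 1) increments,
# initialized with X_0 = w_0. One realization, illustrative sketch.
import random

random.seed(3)
N = 1000
w = [random.gauss(0.0, 1.0) for _ in range(N)]   # independent N(0, 1) draws

X = [w[0]]                      # X_0 = w_0
for n in range(1, N):
    X.append(X[-1] + w[n])      # X_n = X_{n-1} + w_n
# Note X_n = w_0 + ... + w_n, so Var[X_n] = n + 1: the process is not stationary.
```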

Random Processes: Statistical Averages
• At any given time instant, the random process defines a random variable.
• The mean, or expectation, of the random process X(t) is a deterministic function of time mX(t) that at each time instant t0 equals the mean of the random variable X(t0). That is, mX(t) = E[X(t)] for all t.
• Since at any t0 the random variable X(t0) is well defined with a PDF fX(t0)(x), we have

  mX(t) = E[X(t)] = ∫−∞..∞ x fX(t)(x) dx

• Example: X(t) = A cos(2πf0t + Θ), where Θ is a random variable uniformly distributed on [0, 2π). For this process, mX(t) = 0 for all t.
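A Monte Carlo sketch of this example: averaging A cos(2πf0t + Θ) over many draws of Θ should give a value near 0 (A = 1, f0 = 1, t = 0.3 are assumed example values):

```python
# Monte Carlo estimate of m_X(t) = E[A cos(2*pi*f0*t + Theta)] with
# Theta ~ Uniform[0, 2*pi); the expectation is 0 for every t.
import math
import random

random.seed(4)
A, f0, t = 1.0, 1.0, 0.3
M = 200_000
thetas = [random.uniform(0.0, 2 * math.pi) for _ in range(M)]
m_X = sum(A * math.cos(2 * math.pi * f0 * t + th) for th in thetas) / M
```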

Random Processes: Statistical Averages
• The autocorrelation function of the random process X(t), denoted RXX(t1, t2), is defined by RXX(t1, t2) = E[X(t1)X(t2)].

• It is usually denoted by RX(t1, t2) for short.

• Example: For X(t) = A cos(2πf0t + Θ) with Θ uniform on [0, 2π),

  RX(t1, t2) = (A²/2) cos(2πf0(t1 − t2))
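A Monte Carlo check of this autocorrelation (A = 2, f0 = 1, t1 = 0.4, t2 = 0.1 are assumed example values):

```python
# Estimate E[X(t1) X(t2)] for X(t) = A cos(2*pi*f0*t + Theta) and compare
# with (A^2/2) cos(2*pi*f0*(t1 - t2)). Illustrative sketch.
import math
import random

random.seed(5)
A, f0 = 2.0, 1.0
t1, t2 = 0.4, 0.1
M = 200_000
acc = 0.0
for _ in range(M):
    th = random.uniform(0.0, 2 * math.pi)
    acc += (A * math.cos(2 * math.pi * f0 * t1 + th)
            * A * math.cos(2 * math.pi * f0 * t2 + th))
R_est = acc / M
R_theory = (A**2 / 2) * math.cos(2 * math.pi * f0 * (t1 - t2))
```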

15
Random Processes: Stationarity
• A strictly stationary process is a process in which, for all n and all (t1, t2, . . . , tn), fX(t1),X(t2),...,X(tn)(x1, x2, . . . , xn) depends only on the relative positions of (t1, t2, . . . , tn), and not on their values directly.

• In other words, a shift in the time origin does not change the statistical properties of the process.

• Equivalently, a strictly stationary process is a process in which, for all n, all (t1, t2, . . . , tn), and all Δ,

  fX(t1),X(t2),...,X(tn)(x1, x2, . . . , xn) = fX(t1+Δ),X(t2+Δ),...,X(tn+Δ)(x1, x2, . . . , xn)

• A process is Mth-order stationary if the above condition holds for all n ≤ M.

Random Processes: Stationarity
• A process X(t) is wide-sense stationary (WSS) if the following conditions are satisfied:
1. mX(t) = E[X(t)] is independent of t
2. RX(t1, t2) depends only on the time difference τ = t1 − t2, and not on t1 and t2 individually

• Example: X(t) = A cos(2πf0t + Θ) with Θ uniform on [0, 2π) is WSS: mX(t) = 0 for all t, and RX(t1, t2) = (A²/2) cos(2πf0(t1 − t2)) depends only on t1 − t2.

Random Processes: Power and Energy
• The power content PX of the random process X(t) is defined as

  PX = lim T→∞ (1/T) ∫−T/2..T/2 E[X²(t)] dt

• If the process is stationary, E[X²(t)] = RX(0) is independent of t, and PX = RX(0).
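A sketch estimating this power by time-averaging one long realization of X(t) = A cos(2πf0t + Θ); for this process PX = A²/2 (A = 3, f0 = 5, T = 100 are assumed example values):

```python
# Time-average power of one realization of X(t) = A cos(2*pi*f0*t + theta)
# via a Riemann sum over an integer number of periods; should approach A^2/2.
import math
import random

random.seed(6)
A, f0 = 3.0, 5.0
theta = random.uniform(0.0, 2 * math.pi)
T, dt = 100.0, 0.001
n = int(T / dt)
P_est = sum((A * math.cos(2 * math.pi * f0 * k * dt + theta)) ** 2
            for k in range(n)) * dt / T
```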

Random Processes: Power and Energy
• The energy content EX of the random process X(t) is defined as

  EX = ∫−∞..∞ E[X²(t)] dt

• If the process is stationary, E[X²(t)] = RX(0) for all t, so EX is infinite unless RX(0) = 0; stationary processes are therefore characterized by their power rather than their energy.

Random Processes: Power and Energy

• Example: For X(t) = A cos(2πf0t + Θ) with Θ uniform on [0, 2π), PX = RX(0) = A²/2.

• The power spectral density of a stationary process X(t) can be defined as the Fourier transform of its autocorrelation function:

  SX(f) = ∫−∞..∞ RX(τ) e^(−j2πfτ) dτ
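A discrete sketch of this relation: transforming samples of RX(τ) = (A²/2) cos(2πf0τ) yields a spectrum that peaks at ±f0. The sampling rate fs = 32 and N = 256 lags are assumed example values, chosen so f0 falls exactly on a DFT bin; plain-Python DFT, illustrative only:

```python
# Wiener-Khinchin sketch: DFT of a sampled autocorrelation sequence.
# For R_X(tau) = (A^2/2) cos(2*pi*f0*tau) the spectrum peaks at +/- f0.
import cmath
import math

A, f0 = 1.0, 4.0
fs = 32.0                  # assumed sampling rate for the lag variable tau
N = 256                    # 256 lags = 32 full periods of the cosine
R = [(A**2 / 2) * math.cos(2 * math.pi * f0 * k / fs) for k in range(N)]

# O(N^2) DFT magnitude, fine for N = 256.
S = [abs(sum(R[k] * cmath.exp(-2j * math.pi * m * k / N) for k in range(N)))
     for m in range(N)]

peak_bin = max(range(N // 2), key=lambda m: S[m])
peak_freq = peak_bin * fs / N          # recovers f0
```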
