
Review of

Random Variables
for Communications
EN 2072
Semester 4 May 2011
Prof. Dileeka Dias
Department of Electronic & Telecommunication
Engineering
University of Moratuwa

Contents
Introduction
Random Variables
  Discrete
  Continuous
Joint Random Variables
  Discrete
  Continuous
Functions of Random Variables
  Single Random Variable
    Mean
    Variance
    Moments
  Two Random Variables
    Correlation and Covariance

Random Variables: Definition

The outcome of a random experiment may be
  A numerical value (1, 2, 3, 4, 5, 6)
  Described by a phrase (Heads, Tails)

From a mathematical point of view it is preferable to have a
numerical value (a real number) assigned to each sample
point according to some rule.
If there are m sample points \lambda_1, \lambda_2, ..., \lambda_m, we assign a real
number x(\lambda_i) to each sample point.
X(\lambda) is the function that maps sample points into the real
numbers x_1, x_2, ..., x_m.
X is a random variable which takes on the values
x_1, x_2, ..., x_m.

Random Variables: Definition

Example 1 (coin toss)
  Heads -> 1
  Tails -> 0
X = 1 or 0, each with probability 1/2

  x_i = 1 if \lambda_i = "Heads", x_i = 0 if \lambda_i = "Tails"

Example 2 (die roll)
  1 -> 10, 2 -> 20, 3 -> 30, 4 -> 40, 5 -> 50, 6 -> 60
X = 10, 20, 30, ..., 60, each with probability 1/6

  x_i = 10 i

Random Variables: Discrete

A discrete random variable maps events to
values of a countable set (e.g., the integers),
with each value in the range having
probability greater than or equal to zero.
A discrete random variable is described by a
discrete probability density function (probability
mass function)

  P_X(x_i), \quad i = 1, 2, ..., m

where \sum_{i=1}^{m} P_X(x_i) = 1

Random Variables: Discrete

Discrete Probability Density Function (PDF) or
Probability Mass Function (PMF) and Discrete
Cumulative Distribution Function (CDF)

Discrete PDF:

  P_X(x) = 1/6,  x = 1
           3/6,  x = 2
           2/6,  x = 4

Discrete CDF:  F_X(x) = P(X \le x)

  F_X(x) = 0,    x < 1
           1/6,  1 \le x < 2
           4/6,  2 \le x < 4
           1,    x \ge 4
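As a sketch, the step from this PMF to its CDF can be checked in Python; the support points and probabilities below are exactly those of the example above:

```python
# Build the discrete CDF F_X(x) = P(X <= x) from the example PMF.
# Fractions keep the arithmetic exact (1/6, 3/6, 2/6).
from fractions import Fraction

pmf = {1: Fraction(1, 6), 2: Fraction(3, 6), 4: Fraction(2, 6)}

def cdf(x):
    """F_X(x): sum the PMF over all support points <= x."""
    return sum(p for xi, p in pmf.items() if xi <= x)

# A valid PMF must sum to 1.
assert sum(pmf.values()) == 1
```

Evaluating `cdf` at points between the jumps reproduces the piecewise values 0, 1/6, 4/6, 1 listed on the slide.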

Random Variables: Discrete

Example: The Poisson random variable
A random variable X is said to have a Poisson
distribution if

  P(X = k) = \frac{e^{-\lambda} \lambda^k}{k!}, \quad k = 0, 1, 2, 3, ...

where \lambda is called the average or the expected value.

[Figures: Poisson PDF and CDF]
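A minimal sketch of this PMF (the value \lambda = 3.0 is chosen arbitrarily for illustration); summing k P(X = k) numerically should recover the mean \lambda:

```python
import math

def poisson_pmf(k, lam):
    """P(X = k) = e^{-lam} * lam^k / k!  (lam is the mean/expected value)."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam = 3.0  # arbitrary illustrative mean
# Truncating the infinite sum at k = 100 is more than enough for lam = 3.
total = sum(poisson_pmf(k, lam) for k in range(100))
mean = sum(k * poisson_pmf(k, lam) for k in range(100))
```

`total` comes out numerically indistinguishable from 1 and `mean` from \lambda, matching the slide's claim that \lambda is the expected value.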

Random Variables: Discrete

Example:
Telephone calls arriving at a
switch and page view requests to a
website are examples of
Poisson processes.
The probability that there are k
incoming calls during the interval between times t
and t + \tau is given by

  P(N(t + \tau) - N(t) = k) = \frac{e^{-\lambda\tau} (\lambda\tau)^k}{k!}

where \lambda is the average call
arrival rate.

Random Variables: Continuous

A random variable is called continuous if it can
assume any value in a continuous range (e.g.,
an interval of the real line).
The continuous random variable X is described
by the (continuous) probability density
function f_X(x).

Random Variables: Continuous

The PDF is denoted by f_X(x); for a very small change
\Delta x in x, the probability that the random variable X
takes a value between x and x + \Delta x is

  f_X(x) \, \Delta x = P(x < X \le x + \Delta x)

Random Variables: Continuous

If there are two points a and b, then the
probability that the random variable takes a
value between a and b is given by

  P(a < X \le b) = \int_a^b f_X(x) \, dx

and the PDF integrates to one over the whole range:

  \int_{-\infty}^{\infty} f_X(x) \, dx = 1

Random Variables: Continuous

The CDF of a continuous random variable is
given by:

  F_X(x) = \int_{-\infty}^{x} f_X(u) \, du

  f_X(x) = \frac{d F_X(x)}{dx}

Random Variables: Continuous

The Gaussian Random Variable

  f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-(x-\mu)^2 / (2\sigma^2)}, \quad -\infty < x < \infty

N(\mu, \sigma^2)
  Mean: \mu
  Variance: \sigma^2
N(0, 1): Standard Normal Distribution

[Figure: Gaussian PDFs and CDFs. Source: Wikipedia]

Random Variables: Continuous

The Gaussian Random Variable

  F_X(x) = \int_{-\infty}^{x} f_X(u) \, du
         = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-(u-\mu)^2 / (2\sigma^2)} \, du
         = \frac{1}{2} + \int_{\mu}^{x} \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-(u-\mu)^2 / (2\sigma^2)} \, du

Substituting t = (u - \mu)/(\sqrt{2}\,\sigma):

  F_X(x) = \frac{1}{2} + \frac{1}{\sqrt{\pi}} \int_{0}^{(x-\mu)/\sqrt{2}\sigma} e^{-t^2} \, dt
         = \frac{1}{2} + \frac{1}{2} \, \mathrm{erf}\!\left(\frac{x-\mu}{\sqrt{2}\,\sigma}\right)

where the error function is defined as

  \mathrm{erf}(x) = \frac{2}{\sqrt{\pi}} \int_{0}^{x} e^{-t^2} \, dt

Random Variables: Continuous

The Gaussian Random Variable

  F_X(x) = \frac{1}{2} + \frac{1}{2} \, \mathrm{erf}\!\left(\frac{x-\mu}{\sqrt{2}\,\sigma}\right)
         = 1 - Q\!\left(\frac{x-\mu}{\sigma}\right)

where \mathrm{erf}(x) = \frac{2}{\sqrt{\pi}} \int_0^x e^{-t^2} \, dt and

  Q(x) = \frac{1}{2}\left[1 - \mathrm{erf}\!\left(\frac{x}{\sqrt{2}}\right)\right]
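These relations translate directly into code; a minimal sketch using the standard library's `math.erf`:

```python
import math

def Q(x):
    """Gaussian tail probability: Q(x) = (1/2)[1 - erf(x / sqrt(2))]."""
    return 0.5 * (1.0 - math.erf(x / math.sqrt(2.0)))

def gaussian_cdf(x, mu=0.0, sigma=1.0):
    """CDF of N(mu, sigma^2): F_X(x) = 1 - Q((x - mu) / sigma)."""
    return 1.0 - Q((x - mu) / sigma)
```

By symmetry of the Gaussian PDF, Q(x) + Q(-x) = 1 and Q(0) = 1/2, which makes a quick sanity check for the implementation.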

Random Variables: Continuous

Example:
Transmitted pulses
  1 -> p(t)
  0 -> -p(t)

Received signal
The received signal is
  Sampled at the peak point
  And compared with a threshold of 0
The sample consists of a contribution
from the signal and a contribution from
the AWGN that corrupts the channel.
The sample value can be
  A_p + n   (a 1 was sent)
  -A_p + n  (a 0 was sent)

n is a sample value of the Gaussian random variable with zero mean

Random Variables: Continuous

Probability of error
  P(e|0) = P(-A_p + n > 0) = P(n > A_p)
  P(e|1) = P(A_p + n < 0) = P(n < -A_p)

[Figure: the two shaded tail regions, P(n < -A_p) and P(n > A_p)]

Random Variables: Continuous

For a Gaussian random variable

  F_X(x) = P(X \le x) = 1 - Q\!\left(\frac{x-\mu}{\sigma}\right)

For the zero-mean noise with standard deviation \sigma,

  P(X > x) = 1 - \left[1 - Q\!\left(\frac{x}{\sigma}\right)\right] = Q\!\left(\frac{x}{\sigma}\right)

so that

  P(e|0) = P(n > A_p) = Q\!\left(\frac{A_p}{\sigma}\right)
  P(e|1) = P(n < -A_p) = Q\!\left(\frac{A_p}{\sigma}\right)
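The two error probabilities above can be evaluated numerically; the peak value A_p and the noise standard deviation \sigma below are hypothetical numbers, not taken from the slides:

```python
import math

def Q(x):
    """Gaussian tail probability: Q(x) = (1/2)[1 - erf(x / sqrt(2))]."""
    return 0.5 * (1.0 - math.erf(x / math.sqrt(2.0)))

# Hypothetical example values: peak sample A_p = 1.0, noise std sigma = 0.5.
A_p, sigma = 1.0, 0.5
p_e0 = Q(A_p / sigma)  # P(e|0) = P(n > A_p)
p_e1 = Q(A_p / sigma)  # P(e|1) = P(n < -A_p), equal by symmetry of the Gaussian
```

Both conditional error probabilities are Q(A_p / \sigma), so the channel's overall error probability is the same regardless of the a priori bit probabilities.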

Two (Joint) Random Variables: Discrete

If X and Y are two discrete random variables, the
conditional probability of x_i given y_j is

  P_{X|Y}(x_i | y_j) = \frac{P_{XY}(x_i, y_j)}{P_Y(y_j)}      (Conditional Probability)

with \sum_i P_{X|Y}(x_i | y_j) = 1 and \sum_j P_{Y|X}(y_j | x_i) = 1.

  P_{XY}(x_i, y_j) = P_{X|Y}(x_i | y_j) P_Y(y_j) = P_{Y|X}(y_j | x_i) P_X(x_i)      (Joint Probability)

Joint Random Variables : Discrete

  P_{XY}(x_i, y_j) = P_{X|Y}(x_i | y_j) P_Y(y_j) = P_{Y|X}(y_j | x_i) P_X(x_i)

Summing the joint probability over one variable gives the
Marginal Probabilities:

  P_Y(y_j) = \sum_i P_{XY}(x_i, y_j)

and similarly

  P_X(x_i) = \sum_j P_{XY}(x_i, y_j)
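A sketch of marginalization and conditioning on a small joint table; the 2x2 joint PMF below is made up purely for illustration:

```python
from fractions import Fraction

# Hypothetical joint PMF P_XY(x, y) over x, y in {0, 1}.
joint = {
    (0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
    (1, 0): Fraction(2, 8), (1, 1): Fraction(2, 8),
}

# Marginals: sum the joint PMF over the other variable.
P_X = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (0, 1)}
P_Y = {y: sum(p for (_, yj), p in joint.items() if yj == y) for y in (0, 1)}

# Conditional: P_{X|Y}(x | y) = P_XY(x, y) / P_Y(y), here for y = 1.
P_X_given_Y1 = {x: joint[(x, 1)] / P_Y[1] for x in (0, 1)}
```

As the slide requires, each conditional PMF sums to 1 over its first argument, and the two marginals each sum to 1.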

Joint Random Variables : Discrete

  P_{XY}(x_i, y_j) = P_{X|Y}(x_i | y_j) P_Y(y_j) = P_{Y|X}(y_j | x_i) P_X(x_i)

X and Y are independent if

  P_{X|Y}(x_i | y_j) = P_X(x_i)  or  P_{Y|X}(y_j | x_i) = P_Y(y_j)

Hence,

  P_{XY}(x_i, y_j) = P_X(x_i) P_Y(y_j)

Joint Random Variables : Discrete

Example: A binary symmetric channel (BSC) has error
probability p. The probability of transmitting a 1 is a
and the probability of transmitting a 0 is 1 - a.
Determine the probabilities of receiving a 1
and a 0 at the receiver.
  X: RV indicating the input
  Y: RV indicating the output
  x_1 = 1, x_2 = 0
  P_X(1) = a, P_X(0) = 1 - a

Joint Random Variables : Discrete

[Figure: BSC transition diagram]
  x_1 = 1, P_X(1) = a        y_1 = 1, P_Y(1) = ??
  x_2 = 0, P_X(0) = 1 - a    y_2 = 0, P_Y(0) = ??

Probability of error:
  P_{Y|X}(0 | 1) = P_{Y|X}(1 | 0) = p
Probability of correct reception:
  P_{Y|X}(0 | 0) = P_{Y|X}(1 | 1) = 1 - p

Using P_Y(y_j) = \sum_i P_{XY}(x_i, y_j) = \sum_i P_{Y|X}(y_j | x_i) P_X(x_i):

for j = 1:
  P_Y(1) = P_{Y|X}(1 | 1) P_X(1) + P_{Y|X}(1 | 0) P_X(0) = (1 - p)a + p(1 - a)
         = p + a - 2pa
for j = 2:
  P_Y(0) = P_{Y|X}(0 | 1) P_X(1) + P_{Y|X}(0 | 0) P_X(0) = pa + (1 - p)(1 - a)
         = 1 - (p + a) + 2pa = 1 - P_Y(1)
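The receiver probabilities derived above can be checked with exact arithmetic; the values of p and a below are hypothetical, chosen only to exercise the formulas:

```python
from fractions import Fraction

# Hypothetical BSC parameters: crossover probability p, P(X = 1) = a.
p, a = Fraction(1, 10), Fraction(3, 10)

# Total probability over the two inputs, as on the slide:
P_Y1 = (1 - p) * a + p * (1 - a)      # P_Y(1) = (1-p)a + p(1-a)
P_Y0 = p * a + (1 - p) * (1 - a)      # P_Y(0) = pa + (1-p)(1-a)
```

The closed forms p + a - 2pa and 1 - P_Y(1) fall out identically, confirming the algebra on the slide.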

Joint Random Variables: Continuous

Joint CDF of continuous random variables

  F_{XY}(x, y) = P(X \le x \text{ and } Y \le y)

Joint PDF of continuous random variables

  f_{XY}(x, y) = \frac{\partial^2}{\partial x \, \partial y} F_{XY}(x, y)

  P(x_1 < X \le x_2, \; y_1 < Y \le y_2) = \int_{x_1}^{x_2} \int_{y_1}^{y_2} f_{XY}(x, y) \, dy \, dx

  \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{XY}(x, y) \, dx \, dy = 1

Joint Random Variables : Continuous

Conditional densities for continuous random
variables: the joint PDF factors through the
conditional PDFs as

  f_{XY}(x, y) = f_{X|Y}(x | y) f_Y(y)
  f_{XY}(x, y) = f_{Y|X}(y | x) f_X(x)

Hence,

  f_{X|Y}(x | y) = \frac{f_{Y|X}(y | x) f_X(x)}{f_Y(y)}      (Bayes' rule for continuous RVs)

X and Y are independent if

  f_{X|Y}(x | y) = f_X(x)  or  f_{Y|X}(y | x) = f_Y(y)

which means that

  f_{XY}(x, y) = f_X(x) f_Y(y)

Functions of a Random Variable

The Mean (Expected Value)

  E[X] = \bar{X} = \int_{-\infty}^{\infty} x f_X(x) \, dx   or   \sum_i x_i P(x_i)

The mean corresponds to the DC value of a signal.

Example: for a Gaussian RV,

  E[X] = \int_{-\infty}^{\infty} \frac{x}{\sqrt{2\pi\sigma^2}} \, e^{-(x-\mu)^2 / (2\sigma^2)} \, dx

Let x = y + \mu:

  E[X] = \int_{-\infty}^{\infty} \frac{y}{\sqrt{2\pi\sigma^2}} \, e^{-y^2 / (2\sigma^2)} \, dy
         + \mu \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-y^2 / (2\sigma^2)} \, dy

The first integrand is an odd function of y, so the first
integral vanishes; the second integral is 1. Hence E[X] = \mu.

Functions of a Random Variable

The Mean (Expected Value)
Let Y = g(X), a function of a RV.

  E[Y] = \bar{Y} = \int_{-\infty}^{\infty} g(x) f_X(x) \, dx   or   \sum_i g(x_i) P(x_i)

Therefore, the Mean Square Value is

  E[X^2] = \int_{-\infty}^{\infty} x^2 f_X(x) \, dx

Functions of a Random Variable

The Mean Square Value

  E[X^2] = \int_{-\infty}^{\infty} x^2 f_X(x) \, dx

E[X^2] is the mean square value of a signal, and
\sqrt{E[X^2]} is the Root Mean Square (RMS) value of a signal.

Functions of a Random Variable

Example

A sinusoidal signal is given by A cos(\omega t). This is
sampled at random time instants. The sampled
output is a random variable X.
Find E[X] and E[X^2].

The sampling instant is a random variable T.
Let \theta = \omega T be another RV, with a uniform
distribution: \theta \sim U(0, 2\pi), p(\theta) = 1/(2\pi).

  X = A \cos\theta

Functions of a Random Variable

Example contd.

  X = A \cos\theta

  E[X] = \int_0^{2\pi} A \cos\theta \cdot \frac{1}{2\pi} \, d\theta = 0

  E[X^2] = \int_0^{2\pi} A^2 \cos^2\theta \cdot \frac{1}{2\pi} \, d\theta = \frac{A^2}{2}
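A quick Monte Carlo check of this example (the amplitude A = 2.0 is an arbitrary choice): sampling \theta uniformly over (0, 2\pi) should give a sample mean near 0 and a mean square near A^2/2:

```python
import math
import random

random.seed(0)  # fixed seed for reproducibility
A = 2.0         # arbitrary amplitude for illustration
N = 200_000

# X = A cos(theta) with theta ~ U(0, 2*pi), sampled N times.
samples = [A * math.cos(random.uniform(0.0, 2.0 * math.pi)) for _ in range(N)]

mean = sum(samples) / N                         # should approach E[X] = 0
mean_square = sum(x * x for x in samples) / N   # should approach A**2 / 2
```

With 200,000 samples the estimates agree with the analytical values to within a few parts in a thousand.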

Functions of a Random Variable

Variance

  \mathrm{Variance}(X) = \sigma_X^2 = E\left[(X - \bar{X})^2\right] = \int_{-\infty}^{\infty} (x - \bar{X})^2 f_X(x) \, dx

  Std. Deviation: \sigma_X = \sqrt{E\left[(X - \bar{X})^2\right]}

The variance corresponds to the average AC power of a signal.

Functions of a Random Variable

Variance

  \sigma_X^2 = E\left[(X - \bar{X})^2\right]
             = E\left[X^2 - 2 X \bar{X} + \bar{X}^2\right]
             = E[X^2] - 2 \bar{X} E[X] + \bar{X}^2
             = E[X^2] - \bar{X}^2
             = E[X^2] - E[X]^2
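The identity Var(X) = E[X^2] - E[X]^2 can be verified on a small discrete distribution (the PMF values below are made up for illustration):

```python
# Made-up discrete PMF: P(X=0)=0.2, P(X=1)=0.5, P(X=3)=0.3.
pmf = {0: 0.2, 1: 0.5, 3: 0.3}

mean = sum(x * p for x, p in pmf.items())               # E[X]
mean_square = sum(x * x * p for x, p in pmf.items())    # E[X^2]

# Variance two ways: from the definition, and from the identity.
var_direct = sum((x - mean) ** 2 * p for x, p in pmf.items())
var_identity = mean_square - mean ** 2
```

Both computations give the same number, as the derivation on the slide guarantees.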

Functions of a Random Variable

Moments
The nth moment of X is given by:

  E[X^n] = \overline{X^n} = \int_{-\infty}^{\infty} x^n f_X(x) \, dx

The nth central moment of X is given by:

  E\left[(X - \bar{X})^n\right] = \int_{-\infty}^{\infty} (x - \bar{X})^n f_X(x) \, dx

Functions of two Random Variables

Correlation and Covariance
These estimate the nature of the dependence between two
random variables.
Correlation:

  R_{XY} = E[XY] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x y \, f_{XY}(x, y) \, dx \, dy

If X and Y are independent:

  R_{XY} = \int_{-\infty}^{\infty} x f_X(x) \, dx \int_{-\infty}^{\infty} y f_Y(y) \, dy = E[X] E[Y]

X and Y are said to be orthogonal if

  R_{XY} = E[XY] = 0

Functions of two Random Variables

Correlation and Covariance
Covariance:

  C_{XY} = \sigma_{XY} = E\left[(X - \bar{X})(Y - \bar{Y})\right]
         = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x - \bar{X})(y - \bar{Y}) f_{XY}(x, y) \, dx \, dy

Expanding:

  C_{XY} = E[XY] - \bar{X} E[Y] - \bar{Y} E[X] + \bar{X}\bar{Y}
         = E[XY] - E[X] E[Y] = R_{XY} - \bar{X}\bar{Y}

Correlation can be positive, zero, or negative.
For independent RVs,

  R_{XY} = E[X] E[Y] = \bar{X}\bar{Y}, so C_{XY} = 0.

Functions of two Random Variables

Correlation and Covariance
Covariance:

  C_{XY} = R_{XY} - \bar{X}\bar{Y}

Correlation can be positive, zero, or negative.
For independent RVs,

  R_{XY} = E[X] E[Y] = \bar{X}\bar{Y}, so C_{XY} = 0:

X and Y are uncorrelated.

Functions of two Random Variables

Correlation and Covariance

[Scatter plots:
  X and Y are positively correlated
  X and Z are negatively correlated
  X and W are uncorrelated]

Functions of two Random Variables

Relationship between Independence and
Correlatedness

Correlation Coefficient:

  \rho_{XY} = \frac{C_{XY}}{\sigma_X \sigma_Y}, \quad -1 \le \rho_{XY} \le 1
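A sketch of estimating \rho_{XY} from samples; the linear relation Y = 2X + noise below is made up so that a strongly positive correlation should emerge:

```python
import math
import random

random.seed(1)  # fixed seed for reproducibility

# Made-up model: Y = 2X + small Gaussian noise, so rho should be close to +1.
xs = [random.gauss(0.0, 1.0) for _ in range(50_000)]
ys = [2.0 * x + random.gauss(0.0, 0.5) for x in xs]

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n

# Sample covariance C_XY and standard deviations sigma_X, sigma_Y.
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)

rho = cov / (sx * sy)  # correlation coefficient, bounded in [-1, 1]
```

For this model the theoretical value is 2/\sqrt{4 + 0.25} \approx 0.97, and the sample estimate lands close to it, always inside the [-1, 1] bound stated on the slide.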
