
UNIT - I

RANDOM VARIABLES

Introduction
Consider an experiment of throwing a coin twice. The outcomes {HH, HT, TH, TT}
constitute the sample space. Each of these outcomes can be associated with a number by
specifying a rule of association (e.g. the number of heads). Such a rule of association is
called a random variable. We denote a random variable by a capital letter (X, Y, etc.) and
any particular value of the random variable by a lower-case letter such as x or y.
Thus a random variable X can be considered as a function that maps all elements of
the sample space S into points on the real line. The notation X(s) = x means that x is the
value associated with the outcome s by the random variable X.
1.1 SAMPLE SPACE
Consider an experiment of throwing a coin twice. The outcomes S = {HH, HT, TH, TT}
constitute the sample space.
1.2 RANDOM VARIABLE
In this sample space each outcome can be associated with a number by specifying a rule of
association. Such a rule of association is called a random variable.
Eg : Number of heads
We denote a random variable by a capital letter (X, Y, etc.) and any particular value of the
random variable by x or y.
S = {HH, HT, TH, TT},  X(S) = {2, 1, 1, 0}
Thus a random variable X can be considered as a function that maps all elements of the
sample space S into points on the real line. The notation X(s) = x means that x is the value
associated with the outcome s by the R.V. X.
Example
In the experiment of throwing a coin twice, the sample space is S = {HH, HT, TH, TT}.
Let X be the random variable given by X(s) = the number of heads in the outcome s.
Note
Any random variable whose only possible values are 0 and 1 is called a Bernoulli random
variable.
1.2.1 DISCRETE RANDOM VARIABLE
Definition : A discrete random variable is a R.V. X whose possible values constitute a finite
set of values or a countably infinite set of values.
Examples
All the R.V.'s from Example 1 above are discrete R.V.'s.


Remark
The meaning of P(X ≤ a):
P(X ≤ a) is simply the probability of the set of outcomes s in the sample space for which
X(s) ≤ a, i.e.
P(X ≤ a) = P{s : X(s) ≤ a}
In Example 1 above we should write
P(X ≤ 1) = P{HT, TH, TT} = 3/4
Here P(X ≤ 1) = 3/4 means the probability that the R.V. X (the number of heads) is less than
or equal to 1 is 3/4.
Distribution function of the random variable X (cumulative distribution function of the
random variable X)
Def :
The distribution function of a random variable X defined on (-∞, ∞) is given by
F(x) = P(X ≤ x) = P{s : X(s) ≤ x}
Note
Let the random variable X take the values x1, x2, ..., xn with probabilities p1, p2, ..., pn,
and let x1 < x2 < ... < xn. Then we have
F(x) = 0 for -∞ < x < x1,
F(x) = P(X = x1) = p1 for x1 ≤ x < x2,
F(x) = P(X = x1) + P(X = x2) = p1 + p2 for x2 ≤ x < x3,
...
F(x) = P(X = x1) + ... + P(X = xn) = p1 + p2 + ... + pn = 1 for x ≥ xn.
1.2.2 PROPERTIES OF DISTRIBUTION FUNCTIONS
Property 1 : P(a < X ≤ b) = F(b) - F(a), where F(x) = P(X ≤ x)
Property 2 : P(a ≤ X ≤ b) = P(X = a) + F(b) - F(a)
Property 3 : P(a < X < b) = P(a < X ≤ b) - P(X = b)
           = F(b) - F(a) - P(X = b),   by Property 1
1.2.3 PROBABILITY MASS FUNCTION (OR) PROBABILITY FUNCTION
Let X be a one-dimensional discrete R.V. which takes the values x1, x2, ... To each possible
outcome xi we can associate a number pi, i.e.,
P(X = xi) = p(xi) = pi,
called the probability of xi. The numbers pi = p(xi) satisfy the following conditions:
(i) p(xi) ≥ 0 for all i
(ii) Σ_{i=1}^{∞} p(xi) = 1
The function p(x) satisfying the above two conditions is called the probability mass
function (or) probability distribution of the R.V. X. The probability distribution {xi, pi} can
be displayed in the form of a table as shown below.

X = xi         : x1   x2   ...   xi   ...
P(X = xi) = pi : p1   p2   ...   pi   ...

Notation
Let S be a sample space. The set of all outcomes s in S such that X(s) = x is denoted by
writing X = x.
P(X = x) = P{s : X(s) = x}
Similarly P(X ≤ a) = P{s : X(s) ∈ (-∞, a]}
and P(a < X ≤ b) = P{s : X(s) ∈ (a, b]}
P(X = a or X = b) = P{(X = a) ∪ (X = b)}
P(X = a and X = b) = P{(X = a) ∩ (X = b)}
and so on.

Theorem 1 : If X1 and X2 are random variables and K is a constant, then KX1, X1 + X2,
X1X2, K1X1 + K2X2, X1 - X2 are also random variables.
Theorem 2 :
If X is a random variable and f(·) is a continuous function, then f(X) is a random
variable.
Note
If F(x) is the distribution function of a one-dimensional random variable, then
I. 0 ≤ F(x) ≤ 1
II. If x < y, then F(x) ≤ F(y)
III. F(-∞) = lim_{x→-∞} F(x) = 0
IV. F(∞) = lim_{x→∞} F(x) = 1
V. If X is a discrete R.V. taking values x1, x2, x3, ..., where x1 < x2 < ... < xi-1 < xi < ..., then
P(X = xi) = F(xi) - F(xi-1)
Example 1.2.1
A random variable X has the following probability function:

Values of X       : 0   1   2   3   4   5    6    7    8
Probability P(X)  : a   3a  5a  7a  9a  11a  13a  15a  17a

(i) Determine the value of a.
(ii) Find P(X < 3), P(X ≥ 3), P(0 < X < 5).
(iii) Find the distribution function of X.
Solution
Table 1
Values of X : 0   1   2   3   4   5    6    7    8
p(x)        : a   3a  5a  7a  9a  11a  13a  15a  17a

(i) We know that if p(x) is a probability mass function, then
Σ_{i=0}^{8} p(xi) = 1
p(0) + p(1) + p(2) + p(3) + p(4) + p(5) + p(6) + p(7) + p(8) = 1
a + 3a + 5a + 7a + 9a + 11a + 13a + 15a + 17a = 1
81a = 1
a = 1/81
Putting a = 1/81 in Table 1, we get Table 2.
Table 2
X = x : 0     1     2     3     4     5      6      7      8
P(x)  : 1/81  3/81  5/81  7/81  9/81  11/81  13/81  15/81  17/81

(ii) P(X < 3) = p(0) + p(1) + p(2)
             = 1/81 + 3/81 + 5/81 = 9/81
(iii) P(X ≥ 3) = 1 - P(X < 3)
              = 1 - 9/81 = 72/81
(iv) P(0 < X < 5) = p(1) + p(2) + p(3) + p(4)   [here 0 and 5 are not included]
                  = 3/81 + 5/81 + 7/81 + 9/81 = 24/81
(v) To find the distribution function of X, using Table 2 we get:

X = x   F(x) = P(X ≤ x)
0       F(0) = p(0) = 1/81
1       F(1) = P(X ≤ 1) = p(0) + p(1) = 1/81 + 3/81 = 4/81
2       F(2) = P(X ≤ 2) = p(0) + p(1) + p(2) = 4/81 + 5/81 = 9/81
3       F(3) = P(X ≤ 3) = p(0) + p(1) + p(2) + p(3) = 9/81 + 7/81 = 16/81
4       F(4) = P(X ≤ 4) = p(0) + ... + p(4) = 16/81 + 9/81 = 25/81
5       F(5) = P(X ≤ 5) = p(0) + ... + p(5) = 25/81 + 11/81 = 36/81
6       F(6) = P(X ≤ 6) = p(0) + ... + p(6) = 36/81 + 13/81 = 49/81
7       F(7) = P(X ≤ 7) = p(0) + ... + p(7) = 49/81 + 15/81 = 64/81
8       F(8) = P(X ≤ 8) = p(0) + ... + p(8) = 64/81 + 17/81 = 81/81 = 1
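The whole calculation can be checked numerically. Below is a minimal Python sketch; the weights 1, 3, 5, ..., 17 and the conclusion a = 1/81 come from the solution above:

```python
from fractions import Fraction

# Probabilities a, 3a, 5a, ..., 17a for X = 0, 1, ..., 8
weights = [2 * x + 1 for x in range(9)]       # 1, 3, 5, ..., 17
a = Fraction(1, sum(weights))                 # total weight is 81, so a = 1/81
p = [w * a for w in weights]                  # the p.m.f. values

print(a)                                      # 1/81
print(sum(p[:3]))                             # P(X < 3)   = 9/81
print(1 - sum(p[:3]))                         # P(X >= 3)  = 72/81
print(sum(p[1:5]))                            # P(0 < X < 5) = 24/81

# Distribution function F(x) = P(X <= x) as cumulative sums
F, total = [], Fraction(0)
for pi in p:
    total += pi
    F.append(total)
print(F[-1])                                  # 1, as required
```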

1.3 CONTINUOUS RANDOM VARIABLE

Def : A R.V. X which takes all possible values in a given interval is called a continuous
random variable.
Example : Age, height and weight are continuous R.V.'s.
1.3.1 PROBABILITY DENSITY FUNCTION
Consider a continuous R.V. X specified on a certain interval (a, b) (which can also be an
infinite interval (-∞, ∞)).
If there is a function y = f(x) such that
lim_{Δx→0} P(x < X < x + Δx)/Δx = f(x),
then this function f(x) is termed the probability density function (or simply density function)
of the R.V. X.
It is also called the frequency function, distribution density or probability density
function.
The curve y = f(x) is called the probability curve or the distribution curve.
Remark
If f(x) is the p.d.f. of the R.V. X, then the probability that a value of the R.V. X will fall in
some interval (a, b) is equal to the definite integral of the function f(x) from a to b:
P(a < X < b) = ∫_a^b f(x) dx
(or)
P(a ≤ X ≤ b) = ∫_a^b f(x) dx

1.3.2 PROPERTIES OF P.D.F

The p.d.f. f(x) of a R.V. X has the following properties:
(i) f(x) ≥ 0, -∞ < x < ∞
(ii) ∫_{-∞}^{∞} f(x) dx = 1
Remark
1. In the case of a discrete R.V. the probability at a point, say at x = c, need not be zero. But
in the case of a continuous R.V. X the probability at a point is always zero:
P(X = c) = ∫_c^c f(x) dx = 0
2. If X is a continuous R.V. then we have
P(a ≤ X ≤ b) = P(a ≤ X < b) = P(a < X ≤ b) = P(a < X < b)
IMPORTANT DEFINITIONS IN TERMS OF P.D.F
If f(x) is the p.d.f. of a random variable X defined in the interval (a, b), then

(i)    Arithmetic mean            = ∫_a^b x f(x) dx
(ii)   Harmonic mean H            : 1/H = ∫_a^b (1/x) f(x) dx
(iii)  Geometric mean G           : log G = ∫_a^b log x f(x) dx
(iv)   Moments about the origin   = ∫_a^b x^r f(x) dx
(v)    Moments about any point A  = ∫_a^b (x - A)^r f(x) dx
(vi)   Moment about the mean, μ_r = ∫_a^b (x - mean)^r f(x) dx
(vii)  Variance, σ²               = ∫_a^b (x - mean)² f(x) dx
(viii) Mean deviation about the mean, M.D. = ∫_a^b |x - mean| f(x) dx

1.3.3 Mathematical Expectations

Def : Let X be a continuous random variable with probability density function f(x).
Then the mathematical expectation of X is denoted by E(X) and is given by
E(X) = ∫_{-∞}^{∞} x f(x) dx
The r-th moment about the origin is denoted by μ'_r:
μ'_r = ∫_{-∞}^{∞} x^r f(x) dx      (A)
Thus
μ'_1 = E(X)      (μ'_1 about the origin)
μ'_2 = E(X²)     (μ'_2 about the origin)
Mean = x̄ = μ'_1 = E(X)
And
Variance = μ'_2 - (μ'_1)²
Variance = E(X²) - [E(X)]²

* r-th moment (about the mean)
Now
E{X - E(X)}^r = ∫_{-∞}^{∞} {x - E(X)}^r f(x) dx
             = ∫_{-∞}^{∞} (x - x̄)^r f(x) dx
Thus
μ_r = E[{X - E(X)}^r] = ∫_{-∞}^{∞} (x - x̄)^r f(x) dx      (B)
This gives the r-th moment about the mean and it is denoted by μ_r.
Put r = 1 in (B); we get
μ_1 = ∫_{-∞}^{∞} (x - x̄) f(x) dx
    = ∫_{-∞}^{∞} x f(x) dx - x̄ ∫_{-∞}^{∞} f(x) dx
    = x̄ - x̄ · 1      [since ∫_{-∞}^{∞} f(x) dx = 1]
    = 0
Put r = 2 in (B); we get
μ_2 = ∫_{-∞}^{∞} (x - x̄)² f(x) dx
Variance = μ_2 = E[X - E(X)]²,
which gives the variance in terms of expectations.


Note
Let g(X) = K (a constant). Then
E[g(X)] = E(K) = ∫_{-∞}^{∞} K f(x) dx
        = K ∫_{-∞}^{∞} f(x) dx
        = K · 1 = K      [since ∫_{-∞}^{∞} f(x) dx = 1]
Thus E(K) = K, i.e. E[a constant] = constant.
1.3.4 EXPECTATIONS (Discrete R.V.'s)
Let X be a discrete random variable with P.M.F. p(x). Then
E(X) = Σ_x x p(x)
For a discrete random variable X,
E(X^r) = Σ_x x^r p(x)      (by definition)
If we denote E(X^r) = μ'_r, then
μ'_r = E[X^r] = Σ_x x^r p(x)
Put r = 1; we get
Mean = μ'_1 = Σ_x x p(x)
Put r = 2; we get
μ'_2 = E[X²] = Σ_x x² p(x)
and
μ_2 = μ'_2 - (μ'_1)² = E(X²) - {E(X)}²
The r-th moment about the mean is
μ_r = E[{X - E(X)}^r] = Σ_x (x - x̄)^r p(x),   where E(X) = x̄
Put r = 2; we get
Variance = μ_2 = Σ_x (x - x̄)² p(x)

1.3.5 ADDITION THEOREM (EXPECTATION)

Theorem 1
If X and Y are two continuous random variables with p.d.f.'s f_X(x) and f_Y(y), then
E(X + Y) = E(X) + E(Y).

1.3.6 MULTIPLICATION THEOREM OF EXPECTATION

Theorem 2
If X and Y are independent random variables,
then E(XY) = E(X) · E(Y).
Note :
If X1, X2, ..., Xn are n independent random variables, then
E[X1 X2 ... Xn] = E(X1) E(X2) ... E(Xn)
Theorem 3
If X is a random variable with p.d.f. f(x) and a is a constant, then
(i) E[a G(X)] = a E[G(X)]
(ii) E[G(X) + a] = E[G(X)] + a,
where G(X) is a function of X which is also a random variable.
Theorem 4
If X is a random variable with p.d.f. f(x) and a and b are constants, then
E[aX + b] = a E(X) + b
Cor 1:
If we take a = 1 and b = -E(X) = -x̄, then we get
E(X - x̄) = E(X) - E(X) = 0
Note
E(1/X) ≥ 1/E(X)
E[log X] ≤ log E(X)
E(X²) ≥ [E(X)]²
1.3.7 EXPECTATION OF A LINEAR COMBINATION OF RANDOM VARIABLES
Let X1, X2, ..., Xn be any n random variables and let a1, a2, ..., an be constants; then
E[a1X1 + a2X2 + ... + anXn] = a1E(X1) + a2E(X2) + ... + anE(Xn)
Result
If X is a random variable, then
Var(aX + b) = a² Var(X), where a and b are constants.
Covariance :
If X and Y are random variables, then the covariance between them is defined as
Cov(X, Y) = E{[X - E(X)] [Y - E(Y)]}
          = E{XY - X E(Y) - E(X) Y + E(X) E(Y)}
Cov(X, Y) = E(XY) - E(X) · E(Y)      (A)
If X and Y are independent, then
E(XY) = E(X) E(Y)      (B)
Substituting (B) in (A), we get Cov(X, Y) = 0.
Thus, if X and Y are independent, then Cov(X, Y) = 0.
Note
(i) Cov(aX, bY) = ab Cov(X, Y)
(ii) Cov(X + a, Y + b) = Cov(X, Y)
(iii) Cov(aX + b, cY + d) = ac Cov(X, Y)
(iv) Var(X1 + X2) = Var(X1) + Var(X2) + 2 Cov(X1, X2)
If X1, X2 are independent, then
Var(X1 + X2) = Var(X1) + Var(X2)
EXPECTATION TABLE

Discrete R.V.'s                              Continuous R.V.'s
1. E(X) = Σ x p(x)                           1. E(X) = ∫ x f(x) dx
2. E(X^r) = μ'_r = Σ_x x^r p(x)              2. E(X^r) = μ'_r = ∫ x^r f(x) dx
3. Mean = μ'_1 = Σ x p(x)                    3. Mean = μ'_1 = ∫ x f(x) dx
4. μ'_2 = Σ x² p(x)                          4. μ'_2 = ∫ x² f(x) dx
5. Variance = μ'_2 - (μ'_1)²                 5. Variance = μ'_2 - (μ'_1)²
   = E(X²) - {E(X)}²                            = E(X²) - {E(X)}²

(all integrals taken over -∞ < x < ∞)
SOLVED PROBLEMS ON DISCRETE R.V.'S

Example 1
When a die is thrown, X denotes the number that turns up. Find E(X), E(X²) and Var(X).
Solution
Let X be the R.V. denoting the number that turns up in a die.
X takes the values 1, 2, 3, 4, 5, 6, each with probability 1/6:

X = x : 1    2    3    4    5    6
p(x)  : 1/6  1/6  1/6  1/6  1/6  1/6

Now
E(X) = Σ_{i=1}^{6} xi p(xi)
     = x1 p(x1) + x2 p(x2) + x3 p(x3) + x4 p(x4) + x5 p(x5) + x6 p(x6)
     = 1 × (1/6) + 2 × (1/6) + 3 × (1/6) + 4 × (1/6) + 5 × (1/6) + 6 × (1/6)
     = 21/6 = 7/2      (1)
E(X²) = Σ_{i=1}^{6} xi² p(xi)
      = x1² p(x1) + x2² p(x2) + x3² p(x3) + x4² p(x4) + x5² p(x5) + x6² p(x6)
      = 1(1/6) + 4(1/6) + 9(1/6) + 16(1/6) + 25(1/6) + 36(1/6)
      = 91/6      (2)
Variance(X) = Var(X) = E(X²) - [E(X)]²
            = 91/6 - 49/4 = 35/12
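The same computation can be confirmed with a few lines of Python (a minimal sketch using exact fractions):

```python
from fractions import Fraction

outcomes = range(1, 7)
p = Fraction(1, 6)                       # each face has probability 1/6

EX  = sum(x * p for x in outcomes)       # 7/2
EX2 = sum(x * x * p for x in outcomes)   # 91/6
var = EX2 - EX ** 2                      # 35/12
print(EX, EX2, var)
```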

Example 2
Find the value of (i) C and (ii) the mean of the following distribution:
f(x) = C(x - x²), 0 < x < 1; 0 otherwise.
Solution
Given f(x) = C(x - x²), 0 < x < 1      (1)
Since f is a p.d.f.,
∫_{-∞}^{∞} f(x) dx = 1
∫_0^1 C(x - x²) dx = 1      [using (1), since f vanishes outside 0 < x < 1]
C [x²/2 - x³/3]_0^1 = 1
C (1/2 - 1/3) = C/6 = 1
C = 6      (2)
Substituting (2) in (1): f(x) = 6(x - x²), 0 < x < 1      (3)
Mean = E(X) = ∫_{-∞}^{∞} x f(x) dx
     = ∫_0^1 x · 6(x - x²) dx      [from (3), since 0 < x < 1]
     = ∫_0^1 (6x² - 6x³) dx
     = [2x³ - (3/2)x⁴]_0^1 = 2 - 3/2 = 1/2
Mean = 1/2

1.4 CONTINUOUS DISTRIBUTION FUNCTION

Def :
If f(x) is the p.d.f. of a continuous random variable X, then the function
F_X(x) = F(x) = P(X ≤ x) = ∫_{-∞}^{x} f(t) dt,   -∞ < x < ∞
is called the distribution function or cumulative distribution function of the random
variable.
* PROPERTIES OF CDF OF A R.V. X
(i) 0 ≤ F(x) ≤ 1, -∞ < x < ∞
(ii) lim_{x→-∞} F(x) = 0,  lim_{x→∞} F(x) = 1
(iii) P(a ≤ X ≤ b) = ∫_a^b f(x) dx = F(b) - F(a)
(iv) F'(x) = dF(x)/dx = f(x) ≥ 0
(v) For a discrete R.V., P(X = xi) = F(xi) - F(xi-1)
Example 1.4.1
Given the p.d.f. of a continuous random variable X as
f(x) = 6x(1 - x), 0 < x < 1; 0 otherwise,
find the c.d.f. of X.
Solution
The c.d.f. is F(x) = ∫_{-∞}^{x} f(t) dt, -∞ < x < ∞.
(i) When x < 0:
F(x) = ∫_{-∞}^{x} f(t) dt = ∫_{-∞}^{x} 0 dt = 0      (1)
(ii) When 0 ≤ x ≤ 1:
F(x) = ∫_{-∞}^{x} f(t) dt = ∫_{-∞}^{0} f(t) dt + ∫_0^x f(t) dt
     = 0 + ∫_0^x 6t(1 - t) dt = 6 [t²/2 - t³/3]_0^x
     = 3x² - 2x³      (2)
(iii) When x > 1:
F(x) = ∫_{-∞}^{x} f(t) dt = ∫_{-∞}^{0} 0 dt + ∫_0^1 6t(1 - t) dt + ∫_1^x 0 dt
     = 6 ∫_0^1 (t - t²) dt = 1      (3)
Using (1), (2) and (3), we get
F(x) = 0,           x < 0
     = 3x² - 2x³,   0 ≤ x ≤ 1
     = 1,           x > 1
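A short numerical check of the c.d.f. just derived (a Python sketch; the midpoint rule and tolerance below are illustrative choices, not from the text):

```python
import math

def f(x):
    # p.d.f. from Example 1.4.1
    return 6 * x * (1 - x) if 0 < x < 1 else 0.0

def F_closed(x):
    # c.d.f. derived above
    if x < 0: return 0.0
    if x > 1: return 1.0
    return 3 * x**2 - 2 * x**3

def F_numeric(x, n=10_000):
    # crude midpoint-rule integration of f from 0 (where f starts) up to x
    lo, hi = 0.0, min(max(x, 0.0), 1.0)
    if hi <= lo: return 0.0
    h = (hi - lo) / n
    return sum(f(lo + (k + 0.5) * h) for k in range(n)) * h

for x in (-0.5, 0.25, 0.5, 0.9, 1.5):
    assert math.isclose(F_closed(x), F_numeric(x), abs_tol=1e-6)
print("closed-form c.d.f. matches numeric integration")
```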
Example 1.4.2
(i) Does f(x) = e^{-x} for x ≥ 0, and 0 for x < 0, define a density function?
(ii) If so, determine the probability that a variate having this density will fall in the interval
(1, 2).
Solution
Given f(x) = e^{-x}, x ≥ 0; 0, x < 0.
(a) In (0, ∞), e^{-x} is positive, so f(x) ≥ 0 in (0, ∞).
(b) ∫_{-∞}^{∞} f(x) dx = ∫_{-∞}^{0} f(x) dx + ∫_0^{∞} f(x) dx
   = ∫_{-∞}^{0} 0 dx + ∫_0^{∞} e^{-x} dx
   = [-e^{-x}]_0^{∞} = 0 + 1 = 1
Hence f(x) is a p.d.f.
(ii) We know that
P(a ≤ X ≤ b) = ∫_a^b f(x) dx
P(1 ≤ X ≤ 2) = ∫_1^2 e^{-x} dx = [-e^{-x}]_1^2
            = -e^{-2} + e^{-1}
            = -0.135 + 0.368 = 0.233

Example 1.4.3
A probability curve y = f(x) has range from 0 to ∞. If f(x) = e^{-x}, find the mean, the
variance and the third moment about the mean.
Solution
Mean = ∫_0^{∞} x f(x) dx = ∫_0^{∞} x e^{-x} dx
     = [x(-e^{-x}) - e^{-x}]_0^{∞} = 1
Mean = 1
Variance μ_2 = ∫_0^{∞} (x - Mean)² f(x) dx = ∫_0^{∞} (x - 1)² e^{-x} dx
μ_2 = 1
Third moment about the mean:
μ_3 = ∫_0^{∞} (x - Mean)³ f(x) dx = ∫_0^{∞} (x - 1)³ e^{-x} dx
Integrating by parts repeatedly,
μ_3 = [(x-1)³(-e^{-x}) - 3(x-1)²(e^{-x}) + 6(x-1)(-e^{-x}) - 6(e^{-x})]_0^{∞}
    = -1 + 3 - 6 + 6 = 2
μ_3 = 2
1.5 MOMENT GENERATING FUNCTION
Def : The moment generating function (MGF) of a random variable X (about the origin)
whose probability function is f(x) is given by
M_X(t) = E[e^{tX}]
       = ∫_{-∞}^{∞} e^{tx} f(x) dx,  for a continuous probability function
       = Σ_x e^{tx} p(x),            for a discrete probability function,
where t is a real parameter and the integration or summation extends over the entire range
of x.
Example 1.5.1
Prove that the r-th moment of the R.V. X about the origin is the coefficient of t^r/r! in M_X(t).
Proof
We know that M_X(t) = E(e^{tX})
= E[1 + tX/1! + (tX)²/2! + (tX)³/3! + ... + (tX)^r/r! + ...]
= E[1] + t E(X) + (t²/2!) E(X²) + ... + (t^r/r!) E(X^r) + ...
= 1 + t μ'_1 + (t²/2!) μ'_2 + (t³/3!) μ'_3 + ... + (t^r/r!) μ'_r + ...      [using μ'_r = E(X^r)]
Thus
M_X(t) = Σ_{r=0}^{∞} (t^r/r!) μ'_r,
and the r-th moment about the origin, μ'_r, is the coefficient of t^r/r! in M_X(t).
Note
1. The above result gives the MGF in terms of the moments.
2. Since M_X(t) generates moments, it is known as the moment generating function.
Example 1.5.2
Find μ'_1 and μ'_2 from M_X(t).
Proof
We know that
M_X(t) = Σ_{r=0}^{∞} (t^r/r!) μ'_r
       = μ'_0 + (t/1!) μ'_1 + (t²/2!) μ'_2 + ... + (t^r/r!) μ'_r + ...      (A)
Differentiating (A) with respect to t, we get
M'_X(t) = μ'_1 + (2t/2!) μ'_2 + (3t²/3!) μ'_3 + ...      (B)
Put t = 0 in (B); we get
M'_X(0) = μ'_1 = Mean
Mean = μ'_1 = M'_X(0)   (or)   [d/dt M_X(t)]_{t=0}
Differentiating (B) with respect to t, we get
M''_X(t) = μ'_2 + t μ'_3 + ...
Put t = 0:
M''_X(0) = μ'_2   (or)   μ'_2 = [d²/dt² M_X(t)]_{t=0}
In general,
μ'_r = [d^r/dt^r M_X(t)]_{t=0}
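These derivative formulas are easy to verify symbolically. The sketch below assumes the SymPy library is available and applies μ'_r = [d^r/dt^r M_X(t)] at t = 0 to the MGF λ/(λ - t) of an exponential variate (derived later, in Section 1.10); it recovers μ'_r = r!/λ^r:

```python
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)
M = lam / (lam - t)                 # MGF of an exponential(lam) variate (Section 1.10)

# r-th moment about origin = r-th derivative of the MGF at t = 0
for r in (1, 2, 3):
    mu_r = sp.diff(M, t, r).subs(t, 0)
    print(r, sp.simplify(mu_r))     # 1/lam, 2/lam**2, 6/lam**3
```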

Example 1.5.3
Obtain the MGF of X about the point X = a.
Proof
The moment generating function of X about the point X = a is
M_X(t) (about X = a) = E[e^{t(X-a)}]
= E[1 + t(X-a) + (t²/2!)(X-a)² + ... + (t^r/r!)(X-a)^r + ...]
      [using the expansion e^x = 1 + x/1! + x²/2! + ...]
= E(1) + t E(X-a) + (t²/2!) E[(X-a)²] + ... + (t^r/r!) E[(X-a)^r] + ...
= 1 + t μ'_1 + (t²/2!) μ'_2 + ... + (t^r/r!) μ'_r + ...,
where μ'_r = E[(X-a)^r].
Result:
M_{cX}(t) = E[e^{tcX}]      (1)
M_X(ct) = E[e^{ctX}]        (2)
From (1) and (2) we get
M_{cX}(t) = M_X(ct)
Example 1.5.4
If X1, X2, ..., Xn are independent random variables, then prove that
M_{X1+X2+...+Xn}(t) = M_{X1}(t) · M_{X2}(t) ··· M_{Xn}(t)
Proof
M_{X1+X2+...+Xn}(t) = E[e^{t(X1+X2+...+Xn)}]
= E[e^{tX1} · e^{tX2} ··· e^{tXn}]
= E(e^{tX1}) · E(e^{tX2}) ··· E(e^{tXn})      [since X1, X2, ..., Xn are independent]
= M_{X1}(t) · M_{X2}(t) ··· M_{Xn}(t)

Example 1.5.5
Prove that if U = (X - a)/h, then M_U(t) = e^{-at/h} · M_X(t/h), where a, h are constants.
Proof
By definition,
M_U(t) = E[e^{tU}]
= E[e^{t(X-a)/h}]
= E[e^{tX/h} · e^{-ta/h}]
= e^{-ta/h} E[e^{(t/h)X}]
= e^{-at/h} M_X(t/h)      [by the definition M_X(t) = E(e^{tX})]
Thus M_U(t) = e^{-at/h} · M_X(t/h), where U = (X - a)/h.

Example 1.5.6
Find the MGF of the distribution
f(x) = 2/3 at x = 1; 1/3 at x = 2; 0 otherwise.
Solution
Given f(1) = 2/3, f(2) = 1/3, f(3) = f(4) = ... = 0.
The MGF of a R.V. X is given by
M_X(t) = E[e^{tX}] = Σ_{x=0}^{∞} e^{tx} f(x)
= e^{0·t} f(0) + e^{t} f(1) + e^{2t} f(2) + ...
= 0 + (2/3) e^{t} + (1/3) e^{2t} + 0
= (e^{t}/3)(2 + e^{t})
Thus the MGF is M_X(t) = (e^{t}/3)(2 + e^{t}), and M_X(t) is the MGF about the origin.

1.6 Discrete Distributions

The important discrete distributions of a random variable X are:
1. Binomial Distribution
2. Poisson Distribution
3. Geometric Distribution
1.6.1 BINOMIAL DISTRIBUTION
Def : A random variable X is said to follow a binomial distribution if its probability law is
given by
P(x) = P(X = x successes) = nC_x p^x q^{n-x},
where x = 0, 1, 2, ..., n and p + q = 1.
Note
Assumptions in the binomial distribution:
i) There are only two possible outcomes for each trial (success or failure).
ii) The probability of a success is the same for each trial.
iii) There are n trials, where n is a constant.
iv) The n trials are independent.
Example 1.6.1
Find the Moment Generating Function (MGF) of the binomial distribution about the origin.
Solution
We know that M_X(t) = Σ_x e^{tx} p(x).
Let X be a random variable which follows the binomial distribution. Then the MGF about
the origin is given by
M_X(t) = E[e^{tX}] = Σ_{x=0}^{n} e^{tx} p(x),   where p(x) = nC_x p^x q^{n-x}
= Σ_{x=0}^{n} e^{tx} nC_x p^x q^{n-x}
= Σ_{x=0}^{n} (pe^t)^x nC_x q^{n-x}
M_X(t) = (q + pe^t)^n

Example 1.6.2
Find the mean and variance of the binomial distribution.
Solution
M_X(t) = (q + pe^t)^n
M'_X(t) = n(q + pe^t)^{n-1} · pe^t
Put t = 0; we get
M'_X(0) = n(q + p)^{n-1} p = np      [since q + p = 1]
Mean = E(X) = np
M''_X(t) = np [(q + pe^t)^{n-1} e^t + e^t (n-1)(q + pe^t)^{n-2} pe^t]
Put t = 0; we get
M''_X(0) = np [(q + p)^{n-1} + (n-1)(q + p)^{n-2} p]
         = np [1 + (n-1)p]
         = np + n²p² - np²
         = n²p² + np(1 - p)
         = n²p² + npq      [since 1 - p = q]
M''_X(0) = E(X²) = n²p² + npq
Var(X) = E(X²) - [E(X)]² = n²p² + npq - n²p² = npq

Var(X) = npq
S.D. = √(npq)
Example 1.6.3
Find the Moment Generating Function (MGF) of the binomial distribution about its mean
(np).
Solution
We know that the MGF of a random variable X about any point a is
M_X(t) (about X = a) = E[e^{t(X-a)}]
Here a = np, the mean of the binomial distribution.
M_X(t) (about X = np) = E[e^{t(X-np)}]
= E[e^{tX} · e^{-tnp}]
= e^{-tnp} E[e^{tX}]
= e^{-tnp} (q + pe^t)^n
= (e^{-tp})^n (q + pe^t)^n
MGF about the mean = (e^{-tp})^n (q + pe^t)^n
Example 1.6.4
Additive property of the binomial distribution.
Solution
The sum of two binomial variates is not, in general, a binomial variate.
Let X and Y be two independent binomial variates with parameters (n1, p1) and (n2, p2)
respectively. Then
M_X(t) = (q1 + p1 e^t)^{n1},   M_Y(t) = (q2 + p2 e^t)^{n2}
M_{X+Y}(t) = M_X(t) · M_Y(t)      [since X and Y are independent R.V.'s]
           = (q1 + p1 e^t)^{n1} (q2 + p2 e^t)^{n2}
The RHS cannot be expressed in the form (q + pe^t)^n. Hence, by the uniqueness theorem
of MGFs, X + Y is not a binomial variate. So, in general, the sum of two binomial variates
is not a binomial variate.
Example 1.6.5
If M_X(t) = (q + pe^t)^{n1} and M_Y(t) = (q + pe^t)^{n2} (same p), then
M_{X+Y}(t) = (q + pe^t)^{n1+n2},
i.e. X + Y is a binomial variate with parameters (n1 + n2, p).


Problems on Binomial Distribution
1. Check whether the following data can follow a binomial distribution: mean = 3;
variance = 4.
Solution
Given
Mean: np = 3      (1)
Variance: npq = 4      (2)
Dividing (2) by (1):
npq/np = 4/3, so q = 4/3, which is > 1.
Since q > 1 is not possible (0 < q < 1), the given data cannot follow a binomial distribution.
Example 1.6.6
The mean and SD of a binomial distribution are 5 and 2 respectively; determine the
distribution.
Solution
Given
Mean: np = 5      (1)
SD: √(npq) = 2, i.e. npq = 4      (2)
Dividing (2) by (1):
npq/np = 4/5, so q = 4/5 and p = 1 - q = 1/5      (3)
Substituting (3) in (1) we get n × 1/5 = 5, so
n = 25
The binomial distribution is
P(X = x) = p(x) = nC_x p^x q^{n-x}
         = 25C_x (1/5)^x (4/5)^{25-x},   x = 0, 1, 2, ..., 25
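A quick numerical check of this distribution in Python (a minimal sketch; math.comb requires Python 3.8+):

```python
from math import comb

n, p, q = 25, 1/5, 4/5                # parameters found above

pmf = [comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]
mean = sum(x * px for x, px in enumerate(pmf))
var  = sum(x * x * px for x, px in enumerate(pmf)) - mean**2

print(round(mean, 10))                # 5.0 = np
print(round(var, 10))                 # 4.0 = npq, so S.D. = 2
```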

1.7 Poisson Distribution

Def :
A random variable X is said to follow a Poisson distribution if its probability law is given by
P(X = x) = p(x) = e^{-λ} λ^x / x!,   x = 0, 1, 2, ...
The Poisson distribution is a limiting case of the binomial distribution under the following
conditions or assumptions:
1. The number of trials n should be indefinitely large, i.e. n → ∞.
2. The probability of success p for each trial should be indefinitely small.
3. np = λ should be finite, where λ is a constant.
* To find the MGF
M_X(t) = E(e^{tX}) = Σ_{x=0}^{∞} e^{tx} p(x)
= Σ_{x=0}^{∞} e^{tx} e^{-λ} λ^x / x!
= e^{-λ} Σ_{x=0}^{∞} (λe^t)^x / x!
= e^{-λ} [1 + λe^t + (λe^t)²/2! + ...]
= e^{-λ} e^{λe^t}
Hence
M_X(t) = e^{λ(e^t - 1)}
* To find the Mean and Variance
We know that M_X(t) = e^{λ(e^t - 1)}
M'_X(t) = e^{λ(e^t - 1)} · λe^t
M'_X(0) = e^{λ(1-1)} · λ = λ
so Mean = μ'_1 = E(X) = λ.
The mean can also be obtained directly:
E(X) = Σ_{x=0}^{∞} x p(x) = Σ_{x=0}^{∞} x e^{-λ} λ^x / x!
= e^{-λ} Σ_{x=1}^{∞} λ^x / (x-1)!
= λ e^{-λ} Σ_{x=1}^{∞} λ^{x-1} / (x-1)!
= λ e^{-λ} [1 + λ + λ²/2! + ...]
= λ e^{-λ} e^{λ} = λ
Mean = λ
μ'_2 = E[X²] = Σ_{x=0}^{∞} x² p(x)
= Σ_{x=0}^{∞} {x(x-1) + x} e^{-λ} λ^x / x!
= Σ_{x=0}^{∞} x(x-1) e^{-λ} λ^x / x! + Σ_{x=0}^{∞} x e^{-λ} λ^x / x!
= λ² e^{-λ} Σ_{x=2}^{∞} λ^{x-2} / (x-2)! + λ
= λ² e^{-λ} [1 + λ + λ²/2! + ...] + λ
= λ² e^{-λ} e^{λ} + λ = λ² + λ
Variance μ_2 = E(X²) - [E(X)]² = λ² + λ - λ² = λ
Variance = λ
Hence Mean = Variance = λ.
Note : The sum of independent Poisson variates is also a Poisson variate.
PROBLEMS ON POISSON DISTRIBUTION
Example 1.7.1
If X is a Poisson variate such that P(X = 1) = 3/10 and P(X = 2) = 1/5, find P(X = 0) and
P(X = 3).
Solution
P(X = x) = e^{-λ} λ^x / x!
P(X = 1) = e^{-λ} λ = 3/10      (1)   [given]
P(X = 2) = e^{-λ} λ²/2! = 1/5      (2)   [given]
Dividing (2) by (1):
λ/2 = (1/5)/(3/10) = 2/3, so λ = 4/3.
P(X = 0) = e^{-λ} λ⁰ / 0! = e^{-4/3}
P(X = 3) = e^{-λ} λ³ / 3! = e^{-4/3} (4/3)³ / 3!
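The two requested probabilities can be evaluated numerically (a minimal Python sketch using λ = 4/3 found above):

```python
from math import exp, factorial

lam = 4/3                                   # lambda from the ratio P(X=2)/P(X=1)

def pois(x):
    return exp(-lam) * lam**x / factorial(x)

print(pois(0))                              # e^(-4/3)             ~ 0.2636
print(pois(3))                              # e^(-4/3)(4/3)^3/3!   ~ 0.1041
```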

Example 1.7.2
If X is a Poisson variate such that
P(X = 2) = 9 P(X = 4) + 90 P(X = 6),
find (i) the mean of X, (ii) the variance of X.
Solution
P(X = x) = e^{-λ} λ^x / x!,   x = 0, 1, 2, ...
Given P(X = 2) = 9 P(X = 4) + 90 P(X = 6):
e^{-λ} λ²/2! = 9 e^{-λ} λ⁴/4! + 90 e^{-λ} λ⁶/6!
1/2 = 9λ²/4! + 90λ⁴/6!
1/2 = 3λ²/8 + λ⁴/8
4 = 3λ² + λ⁴
λ⁴ + 3λ² - 4 = 0
(λ² - 1)(λ² + 4) = 0, so λ² = 1 or λ² = -4
λ = 1   (λ² = -4 gives λ = ±2i, inadmissible)
Mean = λ = 1, Variance = λ = 1
Standard deviation = 1

1.7.3 Derive the probability mass function of the Poisson distribution as a limiting case of
the binomial distribution.
Solution
We know that the binomial distribution is
P(X = x) = nC_x p^x q^{n-x} = n!/[(n-x)! x!] · p^x (1-p)^{n-x}
Writing n!/(n-x)! = n(n-1)(n-2)...(n-x+1), we get
P(X = x) = [n(n-1)(n-2)...(n-x+1)/x!] p^x (1-p)^{n-x}
Putting np = λ, i.e. p = λ/n,
P(X = x) = [n(n-1)...(n-x+1)/x!] (λ/n)^x (1 - λ/n)^{n-x}
= (λ^x/x!) · 1(1 - 1/n)(1 - 2/n)...(1 - (x-1)/n) · (1 - λ/n)^{n-x}
Letting n → ∞,
lim_{n→∞} 1(1 - 1/n)(1 - 2/n)...(1 - (x-1)/n) = 1
and we know that
lim_{n→∞} (1 - λ/n)^{n-x} = e^{-λ}
Hence, in the limit,
P(X = x) = e^{-λ} λ^x / x!,   x = 0, 1, 2, ...,
which is the Poisson probability law.
1.8 GEOMETRIC DISTRIBUTION
Def : A discrete random variable X is said to follow a geometric distribution if it assumes
only positive integer values and its probability mass function is given by
P(X = x) = q^{x-1} p,   x = 1, 2, 3, ...,
where q = 1 - p.

Example 1.8.1
To find the MGF:
M_X(t) = E[e^{tX}] = Σ_{x=1}^{∞} e^{tx} q^{x-1} p
= (p/q) Σ_{x=1}^{∞} e^{tx} q^x
= (p/q) Σ_{x=1}^{∞} (e^t q)^x
= (p/q) [(e^t q) + (e^t q)² + (e^t q)³ + ...]
Let y = e^t q; then
= (p/q) [y + y² + y³ + ...] = (p/q) y (1 - y)^{-1}
= (p/q) q e^t [1 - q e^t]^{-1}
= p e^t [1 - q e^t]^{-1}
M_X(t) = pe^t / (1 - qe^t)
* To find the Mean and Variance
M'_X(t) = [(1 - qe^t) pe^t - pe^t(-qe^t)] / (1 - qe^t)²
        = pe^t / (1 - qe^t)²
E(X) = M'_X(0) = p/(1 - q)² = p/p² = 1/p
Mean = 1/p
M''_X(t) = d/dt [pe^t (1 - qe^t)^{-2}]
         = [(1 - qe^t)² pe^t + 2pe^t · qe^t (1 - qe^t)] / (1 - qe^t)⁴
M''_X(0) = (1 + q)/p²
Var(X) = E(X²) - [E(X)]² = (1 + q)/p² - 1/p² = q/p²
Var(X) = q/p²
Note:
Another form of the geometric distribution is
P[X = x] = q^x p,   x = 0, 1, 2, ...
M_X(t) = p/(1 - qe^t)
Mean = q/p,   Variance = q/p²

Example 1.8.2
If the MGF of X is (5 - 4e^t)^{-1}, find the distribution of X and P(X = 5).
Solution
Let the geometric distribution be P(X = x) = q^x p, x = 0, 1, 2, ...
The MGF of this geometric distribution is given by
p / (1 - qe^t)      (1)
Here M_X(t) = (5 - 4e^t)^{-1} = (1/5) / (1 - (4/5)e^t)      (2)
Comparing (1) and (2) we get q = 4/5, p = 1/5.
P(X = x) = p q^x = (1/5)(4/5)^x,   x = 0, 1, 2, 3, ...
P(X = 5) = (1/5)(4/5)^5 = 4^5/5^6
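A small Python sketch confirming the value of P(X = 5) and that the p.m.f. sums to 1:

```python
p, q = 1/5, 4/5                     # read off from the MGF above

def geom_pmf(x):
    # P(X = x) = p q^x, x = 0, 1, 2, ... (the form used in this example)
    return p * q**x

print(geom_pmf(5))                  # (1/5)(4/5)^5 = 4^5/5^6 ~ 0.0655
print(sum(geom_pmf(x) for x in range(200)))   # ~ 1, sanity check
```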

1.9 CONTINUOUS DISTRIBUTIONS

If X is a continuous random variable, then we have the following distributions:
1. Uniform (Rectangular) Distribution
2. Exponential Distribution
3. Gamma Distribution
4. Normal Distribution
1.9.1 Uniform Distribution (Rectangular Distribution)
Def : A random variable X is said to follow a uniform distribution if its density is
f(x) = 1/(b - a),   a < x < b
     = 0,           otherwise

* To find the MGF
M_X(t) = ∫_{-∞}^{∞} e^{tx} f(x) dx
= ∫_a^b e^{tx} · 1/(b - a) dx
= (1/(b - a)) [e^{tx}/t]_a^b
= (e^{bt} - e^{at}) / ((b - a)t)
The MGF of the uniform distribution is
M_X(t) = (e^{bt} - e^{at}) / ((b - a)t)
* To find the Mean and Variance
E(X) = ∫_{-∞}^{∞} x f(x) dx = ∫_a^b x/(b - a) dx
= (1/(b - a)) [x²/2]_a^b = (b² - a²)/(2(b - a)) = (a + b)/2
Mean μ'_1 = (a + b)/2
Putting r = 2 in μ'_r = ∫_a^b x^r f(x) dx, we get
μ'_2 = ∫_a^b x²/(b - a) dx = (b³ - a³)/(3(b - a)) = (a² + ab + b²)/3
Variance = μ'_2 - (μ'_1)²
= (a² + ab + b²)/3 - ((a + b)/2)²
= (b - a)²/12
Variance = (b - a)²/12
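A simulation check of these formulas (a rough Python sketch; the endpoints a = 2 and b = 5 are illustrative values, not from the text):

```python
import random

a, b = 2.0, 5.0                      # illustrative endpoints (hypothetical values)
random.seed(1)
xs = [random.uniform(a, b) for _ in range(200_000)]

mean = sum(xs) / len(xs)
var  = sum((x - mean)**2 for x in xs) / len(xs)

print(mean, (a + b) / 2)             # sample mean vs (a+b)/2    = 3.5
print(var, (b - a)**2 / 12)          # sample var  vs (b-a)^2/12 = 0.75
```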

PROBLEMS ON UNIFORM DISTRIBUTION

Example 1.9.1
If X is uniformly distributed over (-α, α), α > 0, find α so that
(i) P(X > 1) = 1/3
(ii) P(|X| < 1) = P(|X| > 1)
Solution
If X is uniformly distributed in (-α, α), then its p.d.f. is
f(x) = 1/2α,   -α < x < α
     = 0,      otherwise
(i) P(X > 1) = 1/3
∫_1^α f(x) dx = 1/3
∫_1^α (1/2α) dx = 1/3
(1/2α)(α - 1) = 1/3
3(α - 1) = 2α
α = 3
(ii) P(|X| < 1) = P(|X| > 1) = 1 - P(|X| < 1)
2 P(|X| < 1) = 1
2 P(-1 < X < 1) = 1
2 ∫_{-1}^{1} f(x) dx = 1
2 · (2/2α) = 1, i.e. 2/α = 1
α = 2
Note:
1. The distribution function F(x) is given by
F(x) = 0,               -∞ < x ≤ a
     = (x - a)/(b - a), a ≤ x ≤ b
     = 1,               b < x < ∞
2. The p.d.f. of a uniform variate X in (-a, a) is given by
f(x) = 1/2a,   -a < x < a
     = 0,      otherwise
1.10 THE EXPONENTIAL DISTRIBUTION

Def : A continuous random variable X is said to follow an exponential distribution with
parameter λ > 0 if its probability density function is given by
f(x) = λe^{-λx},   x > 0
     = 0,          otherwise
To find the MGF
Solution
M_X(t) = ∫_{-∞}^{∞} e^{tx} f(x) dx
= ∫_0^{∞} e^{tx} λ e^{-λx} dx
= λ ∫_0^{∞} e^{-(λ - t)x} dx
= λ [e^{-(λ-t)x} / (-(λ - t))]_0^{∞}
= λ/(λ - t)
MGF of X: M_X(t) = λ/(λ - t),   λ > t
* To find the Mean and Variance

We know that the MGF is
M_X(t) = λ/(λ - t) = 1/(1 - t/λ) = (1 - t/λ)^{-1}
= 1 + t/λ + t²/λ² + ... + t^r/λ^r + ...
= 1 + (t/1!)(1/λ) + (t²/2!)(2!/λ²) + ... + (t^r/r!)(r!/λ^r) + ...
so M_X(t) = Σ_{r=0}^{∞} (t^r/r!)(r!/λ^r).
Mean = μ'_1 = coefficient of t/1! = 1/λ
μ'_2 = coefficient of t²/2! = 2/λ²
Variance = μ_2 = μ'_2 - (μ'_1)² = 2/λ² - 1/λ² = 1/λ²
Thus
Mean = 1/λ,   Variance = 1/λ²
Example 1.10.1
Let X be a random variable with p.d.f.
f(x) = (1/3) e^{-x/3},   x > 0
     = 0,                otherwise
Find
1) P(X > 3)
2) the MGF of X
Solution
We know that the exponential distribution is f(x) = λe^{-λx}, x > 0.
Here λ = 1/3.
P(X > 3) = ∫_3^{∞} f(x) dx = ∫_3^{∞} (1/3)e^{-x/3} dx = [-e^{-x/3}]_3^{∞} = e^{-1}
P(X > 3) = e^{-1}
The MGF is
M_X(t) = λ/(λ - t) = (1/3)/((1/3) - t)
M_X(t) = 1/(1 - 3t)
Note
If X is exponentially distributed, then
P(X > s + t / X > s) = P(X > t), for any s, t > 0 (the memorylessness property).
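Both P(X > 3) and the memorylessness property in the note can be checked by simulation (a rough Python sketch with λ = 1/3 from the example; the values s = 2, t = 3 are illustrative):

```python
import random
from math import exp

lam = 1/3                              # the rate from Example 1.10.1
random.seed(7)
xs = [random.expovariate(lam) for _ in range(500_000)]

# P(X > 3) should be e^{-3*lam} = e^{-1}
p_gt3 = sum(x > 3 for x in xs) / len(xs)
print(p_gt3, exp(-1))

# Memorylessness: P(X > s + t | X > s) = P(X > t)
s, t = 2.0, 3.0
cond = sum(x > s + t for x in xs) / sum(x > s for x in xs)
print(cond, exp(-lam * t))
```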
1.11 GAMMA DISTRIBUTION
Definition
A continuous random variable X taking non-negative values is said to follow a gamma
distribution if its probability density function is given by
f(x) = x^{λ-1} e^{-x} / Γ(λ),   λ > 0, 0 < x < ∞
     = 0,                       elsewhere,
where Γ(λ) = ∫_0^{∞} x^{λ-1} e^{-x} dx and λ is the parameter of the distribution.
Additive property of Gamma variates
If X1, X2, X3, ..., Xk are independent gamma variates with parameters λ1, λ2, ..., λk
respectively, then X1 + X2 + X3 + ... + Xk is also a gamma variate with parameter
λ1 + λ2 + ... + λk.
Example 1.11.1
Customer demand for milk in a certain locality, per month, is known to be a general gamma
R.V. If the average demand is a liters and the most likely demand is b liters (b < a), what is
the variance of the demand?
Solution :
Let X represent the monthly customer demand for milk, following a general gamma
distribution with density
f(x) = (λ^k / Γ(k)) x^{k-1} e^{-λx},   x > 0.
The average demand is the value of E(X); the most likely demand is the mode of X, i.e. the
value of X at which the density function is maximum.
f'(x) = (λ^k / Γ(k)) [(k-1) x^{k-2} e^{-λx} - λ x^{k-1} e^{-λx}]
f'(x) = 0 when x = (k-1)/λ, and f(x) is maximum there.
Therefore the most likely demand is
(k-1)/λ = b      (1)
and the average demand is
E(X) = k/λ = a      (2)
Subtracting (1) from (2): 1/λ = a - b.
Now V(X) = k/λ² = (k/λ)(1/λ) = a(a - b).
TUTORIAL QUESTIONS
1. It is known that the probability that an item produced by a certain machine will be
defective is 0.05. If the produced items are sent to the market in packets of 20, find the
number of packets containing at least 2, exactly 2 and at most 2 defective items in a
consignment of 1000 packets using (i) the binomial distribution, (ii) the Poisson
approximation to the binomial distribution.
2. The daily consumption of milk in excess of 20,000 gallons is approximately
exponentially distributed with mean 3,000 gallons. The city has a daily stock of 35,000
gallons. What is the probability that, of two days selected at random, the stock is
insufficient on both days?
3. The density function of a random variable X is given by f(x) = Kx(2 - x), 0 ≤ x ≤ 2.
Find K, the mean, the variance and the r-th moment.
4. A binomial variable X satisfies the relation 9P(X = 4) = P(X = 2) when n = 6. Find the
parameter p of the binomial distribution.
5. Find the M.G.F. of the Poisson distribution.
6. If X and Y are independent Poisson variates such that P(X = 1) = P(X = 2) and
P(Y = 2) = P(Y = 3), find V(X - 2Y).
7. A discrete random variable has the following probability distribution:
X    : 0   1   2   3   4   5    6    7    8
P(X) : a   3a  5a  7a  9a  11a  13a  15a  17a
Find the value of a, P(X < 3) and the c.d.f. of X.
8. In a component manufacturing industry there is a small probability of 1/500 for any
component to be defective. The components are supplied in packets of 10. Use the Poisson
distribution to calculate the approximate number of packets containing (1) no defective
component, (2) two defective components, in a consignment of 10,000 packets.

WORKED OUT EXAMPLES

Example 1
Given the p.d.f. of a continuous random variable X as
f(x) = 6x(1 - x), 0 < x < 1; 0 otherwise,
find the c.d.f. of X.
Solution
The c.d.f. is F(x) = ∫_{-∞}^{x} f(t) dt, -∞ < x < ∞.
(i) When x < 0:
F(x) = ∫_{-∞}^{x} 0 dt = 0
(ii) When 0 ≤ x ≤ 1:
F(x) = ∫_{-∞}^{0} f(t) dt + ∫_0^x f(t) dt = 0 + ∫_0^x 6t(1 - t) dt
     = 6 [t²/2 - t³/3]_0^x = 3x² - 2x³
(iii) When x > 1:
F(x) = ∫_{-∞}^{0} 0 dt + ∫_0^1 6t(1 - t) dt + ∫_1^x 0 dt = 6 ∫_0^1 (t - t²) dt = 1
Using (1), (2) and (3) we get
F(x) = 0,           x < 0
     = 3x² - 2x³,   0 ≤ x ≤ 1
     = 1,           x > 1

Example 2
A random variable X has the following probability function:
Values of X      : 0   1   2   3   4   5    6    7    8
Probability P(X) : a   3a  5a  7a  9a  11a  13a  15a  17a
(i) Determine the value of a.
(ii) Find P(X < 3), P(X ≥ 3), P(0 < X < 5).
(iii) Find the distribution function of X.
Solution
Table 1
Values of X : 0   1   2   3   4   5    6    7    8
p(x)        : a   3a  5a  7a  9a  11a  13a  15a  17a
(i) We know that if p(x) is a probability mass function, then Σ_{i=0}^{8} p(xi) = 1:
p(0) + p(1) + p(2) + p(3) + p(4) + p(5) + p(6) + p(7) + p(8) = 1
a + 3a + 5a + 7a + 9a + 11a + 13a + 15a + 17a = 1
81a = 1, so a = 1/81.
Putting a = 1/81 in Table 1, we get Table 2.
Table 2
X = x : 0     1     2     3     4     5      6      7      8
P(x)  : 1/81  3/81  5/81  7/81  9/81  11/81  13/81  15/81  17/81
(ii) P(X < 3) = p(0) + p(1) + p(2) = 1/81 + 3/81 + 5/81 = 9/81
P(X ≥ 3) = 1 - P(X < 3) = 1 - 9/81 = 72/81
P(0 < X < 5) = p(1) + p(2) + p(3) + p(4)   [here 0 and 5 are not included]
             = 3/81 + 5/81 + 7/81 + 9/81 = 24/81
(iii) To find the distribution function of X, using Table 2 we get:
X = x   F(x) = P(X ≤ x)
0       F(0) = p(0) = 1/81
1       F(1) = P(X ≤ 1) = p(0) + p(1) = 1/81 + 3/81 = 4/81
2       F(2) = P(X ≤ 2) = p(0) + p(1) + p(2) = 4/81 + 5/81 = 9/81
3       F(3) = P(X ≤ 3) = p(0) + ... + p(3) = 9/81 + 7/81 = 16/81
4       F(4) = P(X ≤ 4) = p(0) + ... + p(4) = 16/81 + 9/81 = 25/81
5       F(5) = P(X ≤ 5) = p(0) + ... + p(5) = 25/81 + 11/81 = 36/81
6       F(6) = P(X ≤ 6) = p(0) + ... + p(6) = 36/81 + 13/81 = 49/81
7       F(7) = P(X ≤ 7) = p(0) + ... + p(7) = 49/81 + 15/81 = 64/81
8       F(8) = P(X ≤ 8) = p(0) + ... + p(8) = 64/81 + 17/81 = 81/81 = 1

Example 3
The mean and SD of a binomial distribution are 5 and 2 respectively; determine the
distribution.
Solution
Given
Mean: np = 5      (1)
SD: √(npq) = 2, i.e. npq = 4      (2)
Dividing (2) by (1): npq/np = 4/5, so q = 4/5 and p = 1 - q = 1/5      (3)
Substituting (3) in (1) we get n × 1/5 = 5, so n = 25.
The binomial distribution is
P(X = x) = p(x) = nC_x p^x q^{n-x}
         = 25C_x (1/5)^x (4/5)^{25-x},   x = 0, 1, 2, ..., 25
Example 4
If X is a Poisson variate such that
P(X = 2) = 9 P(X = 4) + 90 P(X = 6),
find (i) the mean of X, (ii) the variance of X.
Solution
P(X = x) = e^{-λ} λ^x / x!, x = 0, 1, 2, ...
Given P(X = 2) = 9 P(X = 4) + 90 P(X = 6):
e^{-λ} λ²/2! = 9 e^{-λ} λ⁴/4! + 90 e^{-λ} λ⁶/6!
1/2 = 3λ²/8 + λ⁴/8
λ⁴ + 3λ² - 4 = 0, so λ² = 1 or λ² = -4
λ = 1   (λ² = -4 gives λ = ±2i, inadmissible)
Mean = λ = 1, Variance = λ = 1
Standard deviation = 1

UNIT II
TWO DIMENSIONAL RANDOM VARIABLES
Introduction

In the previous chapter we studied various aspects of the theory of a single R.V. In this
chapter we extend our theory to include two R.V.'s, one for each coordinate axis X and Y
of the XY plane.
DEFINITION : Let S be the sample space. Let X = X(s) and Y = Y(s) be two functions,
each assigning a real number to each outcome s ∈ S. Then (X, Y) is a two-dimensional
random variable.
2.1 Types of random variables
1. Discrete R.V.'s
2. Continuous R.V.'s
Discrete R.V.'s (Two-Dimensional Discrete R.V.'s)
If the possible values of (X, Y) are finite, then (X, Y) is called a two-dimensional discrete
R.V. and it can be represented by (xi, yj), i = 1, 2, ..., m; j = 1, 2, ..., n.
In the study of two-dimensional discrete R.V.'s we have the following 5 important terms:

Joint Probability Function (JPF) (or) Joint Probability Mass Function
Joint Probability Distribution
Marginal Probability Function of X
Marginal Probability Function of Y
Conditional Probability Function

2.1.1 Joint Probability Function of discrete R.V.'s X and Y

The function P(X = xi, Y = yj) = P(xi, yj) is called the joint probability function for the
discrete random variables X and Y and is denoted by pij.
Note
1. P(X = xi, Y = yj) = P[(X = xi) ∩ (Y = yj)] = pij
2. It should satisfy the following conditions:
(i) pij ≥ 0 for all i, j
(ii) Σ_j Σ_i pij = 1
2.1.2 Marginal Probability Function of X
If the joint probability distribution of two random variables X and Y is given, then the
marginal probability function of X is given by
P_X(xi) = pi. = Σ_j pij
and similarly P_Y(yj) = p.j = Σ_i pij (the marginal probability function of Y).
Conditional Probabilities
The conditional probability function of X given Y = yj is given by
P[X = xi / Y = yj] = P[X = xi, Y = yj] / P[Y = yj] = pij / p.j
The set {xi, pij / p.j}, i = 1, 2, 3, ..., is called the conditional probability distribution of X
given Y = yj.
The conditional probability function of Y given X = xi is given by
P[Y = yj / X = xi] = P[X = xi, Y = yj] / P[X = xi] = pij / pi.
The set {yj, pij / pi.}, j = 1, 2, 3, ..., is called the conditional probability distribution of Y
given X = xi.
SOLVED PROBLEMS ON MARGINAL DISTRIBUTION
Example 2.1.1
From the following joint distribution of X and Y, find the marginal distributions.

        X = 0   X = 1   X = 2
Y = 0   3/28    9/28    3/28
Y = 1   3/14    3/14    0
Y = 2   1/28    0       0

Solution

         X = 0       X = 1    X = 2   P_Y(y) = P(Y = y)
Y = 0    3/28        9/28     3/28    15/28 = P_Y(0)
Y = 1    3/14        3/14     0       6/14 = 3/7 = P_Y(1)
Y = 2    1/28        0        0       1/28 = P_Y(2)
P_X(x)   10/28=5/14  15/28    3/28    1

The marginal distribution of X:
P_X(0) = P(X = 0) = p(0,0) + p(0,1) + p(0,2) = 3/28 + 3/14 + 1/28 = 5/14
P_X(1) = P(X = 1) = p(1,0) + p(1,1) + p(1,2) = 9/28 + 3/14 + 0 = 15/28
P_X(2) = P(X = 2) = p(2,0) + p(2,1) + p(2,2) = 3/28 + 0 + 0 = 3/28
The marginal probability function of X is
P_X(x) = 5/14,  x = 0
       = 15/28, x = 1
       = 3/28,  x = 2
The marginal distribution of Y:
P_Y(0) = P(Y = 0) = p(0,0) + p(1,0) + p(2,0) = 15/28
P_Y(1) = P(Y = 1) = p(0,1) + p(1,1) + p(2,1) = 3/7
P_Y(2) = P(Y = 2) = p(0,2) + p(1,2) + p(2,2) = 1/28
The marginal probability function of Y is
P_Y(y) = 15/28, y = 0
       = 3/7,   y = 1
       = 1/28,  y = 2
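The marginal totals can be checked mechanically (a minimal Python sketch with the joint table of Example 2.1.1):

```python
from fractions import Fraction as F

# joint table p[y][x] from Example 2.1.1
p = [[F(3, 28), F(9, 28), F(3, 28)],   # Y = 0
     [F(3, 14), F(3, 14), F(0)],       # Y = 1
     [F(1, 28), F(0),     F(0)]]       # Y = 2

px = [sum(row[x] for row in p) for x in range(3)]   # marginal of X (column sums)
py = [sum(row) for row in p]                        # marginal of Y (row sums)

print(px)                  # [5/14, 15/28, 3/28]
print(py)                  # [15/28, 3/7, 1/28]
print(sum(px), sum(py))    # both equal 1
```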

2.3 CONTINUOUS RANDOM VARIABLES

Two-dimensional continuous R.V.'s
If (X, Y) can take all the values in a region R in the XY plane, then (X, Y) is called a
two-dimensional continuous random variable.
Joint probability density function :
(i) f_XY(x, y) ≥ 0 ;  (ii) ∫_{-∞}^{∞} ∫_{-∞}^{∞} f_XY(x, y) dy dx = 1
Joint probability distribution function:
F(x, y) = P[X ≤ x, Y ≤ y] = ∫_{-∞}^{y} ∫_{-∞}^{x} f(u, v) du dv
Marginal probability density function:
f(x) = f_X(x) = ∫_{-∞}^{∞} f_XY(x, y) dy   (marginal pdf of X)
f(y) = f_Y(y) = ∫_{-∞}^{∞} f_XY(x, y) dx   (marginal pdf of Y)
Conditional probability density function:
(i) f(y / x) = f(x, y)/f(x),  f(x) > 0
(ii) f(x / y) = f(x, y)/f(y),  f(y) > 0
Example 2.3.1
Show that the function
f(x, y) = (2/5)(2x + 3y), 0 < x < 1, 0 < y < 1
        = 0,              otherwise
is a joint density function of X and Y.
Solution
We know that if f(x, y) satisfies the conditions
(i) f(x, y) ≥ 0, (ii) ∫∫ f(x, y) dx dy = 1, then f(x, y) is a joint density function.
(i) f(x, y) = (2/5)(2x + 3y) ≥ 0 in the given region 0 < x < 1, 0 < y < 1.
(ii) ∫_0^1 ∫_0^1 (2/5)(2x + 3y) dx dy
= (2/5) ∫_0^1 [x² + 3xy]_0^1 dy
= (2/5) ∫_0^1 (1 + 3y) dy
= (2/5) [y + 3y²/2]_0^1
= (2/5)(1 + 3/2) = (2/5)(5/2) = 1
Since f(x, y) satisfies the two conditions, it is a joint density function.


Example 2.3.2
The joint density of the random variables X and Y is given by
f(x, y) = 8xy,  0 < x < 1, 0 < y < x
        = 0,    otherwise
Find (i) f_X(x), (ii) f_Y(y), (iii) f(y/x).
Solution
We know that:
(i) The marginal pdf of X is
f_X(x) = f(x) = ∫_0^x 8xy dy = 8x [y²/2]_0^x = 4x³
f(x) = 4x³, 0 < x < 1
(ii) The marginal pdf of Y is
f_Y(y) = f(y) = ∫_y^1 8xy dx = 8y [x²/2]_y^1 = 4y(1 - y²)
f(y) = 4y(1 - y²), 0 < y < 1
(iii) We know that
f(y / x) = f(x, y)/f(x) = 8xy/4x³ = 2y/x²,   0 < y < x, 0 < x < 1
Result
Marginal pdf of X : 4x³, 0 < x < 1
Marginal pdf of Y : 4y(1 - y²), 0 < y < 1
f(y/x)            : 2y/x², 0 < y < x, 0 < x < 1

2.4 REGRESSION
* Lines of regression
The line of regression of X on Y is given by
x - x̄ = r (σx/σy)(y - ȳ)
The line of regression of Y on X is given by
y - ȳ = r (σy/σx)(x - x̄)
* Angle between the two lines of regression:
tan θ = [(1 - r²)/r] · [σx σy / (σx² + σy²)]
* Regression coefficients
Regression coefficient of Y on X: b_YX = r σy/σx
Regression coefficient of X on Y: b_XY = r σx/σy
Correlation coefficient: r = ±√(b_XY · b_YX)

Example 2.4.1
From the following data, find
(i) the two regression equations,
(ii) the coefficient of correlation between the marks in Economics and Statistics,
(iii) the most likely marks in Statistics when the marks in Economics are 30.

Marks in Economics  : 25  28  35  32  31  36  29  38  34  32
Marks in Statistics : 43  46  49  41  36  32  31  30  33  39

Solution

X     Y     X - X̄   Y - Ȳ   (X - X̄)²   (Y - Ȳ)²   (X - X̄)(Y - Ȳ)
25    43    -7       5        49          25          -35
28    46    -4       8        16          64          -32
35    49     3      11         9         121           33
32    41     0       3         0           9            0
31    36    -1      -2         1           4            2
36    32     4      -6        16          36          -24
29    31    -3      -7         9          49           21
38    30     6      -8        36          64          -48
34    33     2      -5         4          25          -10
32    39     0       1         0           1            0
320   380    0       0       140         398          -93

Here X̄ = ΣX/n = 320/10 = 32 and Ȳ = ΣY/n = 380/10 = 38.
Coefficient of regression of Y on X:
b_YX = Σ(X - X̄)(Y - Ȳ) / Σ(X - X̄)² = -93/140 = -0.6643
Coefficient of regression of X on Y:
b_XY = Σ(X - X̄)(Y - Ȳ) / Σ(Y - Ȳ)² = -93/398 = -0.2337
Equation of the line of regression of X on Y:
X - X̄ = b_XY (Y - Ȳ)
X - 32 = -0.2337 (Y - 38)
X = -0.2337 Y + 0.2337 × 38 + 32
X = -0.2337 Y + 40.8806
Equation of the line of regression of Y on X:
Y - Ȳ = b_YX (X - X̄)
Y - 38 = -0.6643 (X - 32)
Y = -0.6643 X + 38 + 0.6643 × 32
Y = -0.6643 X + 59.2576
Coefficient of correlation:
r² = b_YX · b_XY = (-0.6643) × (-0.2337) = 0.1552
r = ±√0.1552 = -0.394
(the negative root is taken, since both regression coefficients are negative)
Now we have to find the most likely marks in Statistics (Y) when the marks in
Economics (X) are 30:
Y = -0.6643 X + 59.2576
Put X = 30; we get
Y = -0.6643 × 30 + 59.2576 = 39.33
Y ≈ 39
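The regression coefficients, the correlation coefficient and the prediction at X = 30 can all be reproduced numerically (a minimal sketch assuming NumPy is available):

```python
import numpy as np

x = np.array([25, 28, 35, 32, 31, 36, 29, 38, 34, 32], float)
y = np.array([43, 46, 49, 41, 36, 32, 31, 30, 33, 39], float)

dx, dy = x - x.mean(), y - y.mean()
b_yx = (dx * dy).sum() / (dx * dx).sum()     # -93/140 ~ -0.6643
b_xy = (dx * dy).sum() / (dy * dy).sum()     # -93/398 ~ -0.2337
r = np.sign(b_yx) * np.sqrt(b_yx * b_xy)     # ~ -0.394

print(b_yx, b_xy, r)
print(b_yx * (30 - x.mean()) + y.mean())     # most likely Y at X = 30, ~ 39.33
```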
2.5 COVARIANCE
Def : If X and Y are random variables, then the covariance between X and Y is defined as
Cov(X, Y) = E(XY) - E(X) · E(Y)
Cov(X, Y) = 0   [if X and Y are independent]

2.6 CORRELATION
Types of Correlation
Positive correlation (if the two variables deviate in the same direction)
Negative correlation (if the two variables constantly deviate in opposite directions)
2.7 KARL PEARSON'S COEFFICIENT OF CORRELATION
The correlation coefficient between two random variables X and Y, usually denoted by
r(X, Y), is a numerical measure of the linear relationship between them and is defined as
r(X, Y) = Cov(X, Y) / (σX σY),
where Cov(X, Y) = (1/n) Σ XY - X̄ Ȳ;  X̄ = ΣX/n, Ȳ = ΣY/n.
* Limits of the correlation coefficient:
-1 ≤ r ≤ 1.
If X and Y are independent, r(X, Y) = 0.
Note : Types of correlation based on r:
Values of r     Correlation is said to be
r = 1           perfect and positive
0 < r < 1       positive
-1 < r < 0      negative
r = 0           uncorrelated
SOLVED PROBLEMS ON CORRELATION
Example 2.7.1
Calculate the correlation coefficient for the following heights of fathers X and their sons Y.
X : 65  66  67  67  68  69  70  72
Y : 67  68  65  68  72  72  69  71
Solution

X    Y    U = X - 68   V = Y - 68   U²    V²    UV
65   67   -3           -1            9     1     3
66   68   -2            0            4     0     0
67   65   -1           -3            1     9     3
67   68   -1            0            1     0     0
68   72    0            4            0    16     0
69   72    1            4            1    16     4
70   69    2            1            4     1     2
72   71    4            3           16     9    12
          ΣU = 0       ΣV = 8      ΣU²=36 ΣV²=52 ΣUV=24

Now
Ū = ΣU/n = 0/8 = 0;   V̄ = ΣV/n = 8/8 = 1
Cov(X, Y) = Cov(U, V) = ΣUV/n - Ū V̄ = 24/8 - 0 = 3      (1)
σU = √(ΣU²/n - Ū²) = √(36/8 - 0) = 2.121      (2)
σV = √(ΣV²/n - V̄²) = √(52/8 - 1) = 2.345      (3)
r(X, Y) = r(U, V) = Cov(U, V)/(σU σV)
        = 3/(2.121 × 2.345)      (by 1, 2, 3)
        = 0.6031
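The same value of r is obtained directly (a minimal sketch assuming NumPy is available; note that np.std defaults to the population standard deviation used above):

```python
import numpy as np

X = np.array([65, 66, 67, 67, 68, 69, 70, 72], float)
Y = np.array([67, 68, 65, 68, 72, 72, 69, 71], float)

U, V = X - 68, Y - 68                        # the change of origin used above
cov = (U * V).mean() - U.mean() * V.mean()   # population covariance = 3
r = cov / (U.std() * V.std())                # population standard deviations
print(r)                                     # ~ 0.6031
print(np.corrcoef(X, Y)[0, 1])               # same value, library cross-check
```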
Example 2.7.2
Let X be a random variable with p.d.f. f(x) = 1/2, -1 ≤ x ≤ 1, and let Y = X². Find the
correlation coefficient between X and Y.
Solution
E(X) = ∫_{-1}^{1} x f(x) dx = ∫_{-1}^{1} (x/2) dx = (1/2)[x²/2]_{-1}^{1} = 0
E(X) = 0
E(Y) = E(X²) = ∫_{-1}^{1} x² f(x) dx = ∫_{-1}^{1} (x²/2) dx = (1/2)[x³/3]_{-1}^{1}
     = (1/2)(1/3 + 1/3) = 1/3
E(XY) = E(X · X²) = E(X³) = ∫_{-1}^{1} x³ f(x) dx = (1/2)[x⁴/4]_{-1}^{1} = 0
E(XY) = 0
Cov(X, Y) = E(XY) - E(X)E(Y) = 0
r(X, Y) = ρ(X, Y) = Cov(X, Y)/(σX σY) = 0
Note : Since E(X) and E(XY) are both zero, Cov(X, Y) = 0 and r = 0; there is no need to
compute σX and σY.
2.8 TRANSFORMS OF TWO-DIMENSIONAL RANDOM VARIABLES
Formula:
f_U(u) = ∫_{-∞}^{∞} f_UV(u, v) dv,   f_V(v) = ∫_{-∞}^{∞} f_UV(u, v) du,
where
f_UV(u, v) = f_XY(x, y) |∂(x, y)/∂(u, v)|
Example 2.8.1
If the joint pdf of (X, Y) is given by f_XY(x, y) = x + y, 0 ≤ x, y ≤ 1, find the pdf of U = XY.
Solution
Given f_XY(x, y) = x + y and U = XY. Let V = Y.
Then x = u/v and y = v.
∂x/∂u = 1/v,  ∂x/∂v = -u/v²,  ∂y/∂u = 0,  ∂y/∂v = 1
J = ∂(x, y)/∂(u, v) = (1/v)(1) - (-u/v²)(0) = 1/v
|J| = 1/v      (1)
The joint p.d.f. of (U, V) is given by
f_UV(u, v) = f_XY(x, y) |J| = (x + y)(1/v) = (u/v + v)(1/v)      (2)
The range of V: since 0 ≤ y ≤ 1 and V = Y, we have 0 ≤ v ≤ 1.
The range of U: given 0 ≤ x ≤ 1, i.e. 0 ≤ u/v ≤ 1, we get 0 ≤ u ≤ v.
Hence the p.d.f. of (U, V) is given by
f_UV(u, v) = u/v² + 1,   0 ≤ u ≤ v, 0 ≤ v ≤ 1      (3)
Now
f_U(u) = ∫ f_UV(u, v) dv = ∫_u^1 (u/v² + 1) dv
       = [-u/v + v]_u^1 = (-u + 1) - (-1 + u) = 2(1 - u)
f_U(u) = 2(1 - u),   0 < u < 1
Summary:
p.d.f. of (U, V) : f_UV(u, v) = u/v² + 1, 0 ≤ u ≤ v, 0 ≤ v ≤ 1
p.d.f. of U = XY : f_U(u) = 2(1 - u), 0 < u < 1
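The density f_U(u) = 2(1 - u) can be checked by Monte Carlo (a rough Python sketch; the rejection-sampling envelope constant 2 bounds f(x, y) = x + y on the unit square):

```python
import random

random.seed(3)

def sample_xy():
    # rejection sampling from f(x, y) = x + y on the unit square (f <= 2)
    while True:
        x, y = random.random(), random.random()
        if 2 * random.random() <= x + y:
            return x, y

us = [x * y for x, y in (sample_xy() for _ in range(200_000))]

# empirical P(U <= u) against the c.d.f. of f_U(u) = 2(1 - u): F(u) = 2u - u^2
for u in (0.1, 0.3, 0.5, 0.8):
    emp = sum(v <= u for v in us) / len(us)
    print(u, emp, 2 * u - u * u)
```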

TUTORIAL QUESTIONS
1. The joint pdf of the r.v.'s X and Y is given by f(x, y) = 3(x + y), 0 < x < 1, 0 < y < 1,
x + y < 1, and 0 otherwise. Find (i) the marginal pdfs of X and Y, (ii) Cov(X, Y).
2. Obtain the correlation coefficient for the following data:
X: 68  64  75  50  64  80  75  40  55  64
Y: 62  58  68  45  81  60  48  48  50  70
3. The two lines of regression are 8X - 10Y + 66 = 0 and 4X - 18Y - 214 = 0. The variance
of X is 9. Find (i) the mean values of X and Y, (ii) the correlation coefficient between X
and Y.
4. If X1, X2, ..., Xn are Poisson variates with parameter λ = 2, use the central limit theorem
to find P(120 ≤ Sn ≤ 160), where Sn = X1 + X2 + ... + Xn and n = 75.
5. If the joint probability density function of a two-dimensional random variable (X, Y) is
given by f(x, y) = x² + xy/3, 0 < x < 1, 0 < y < 2, and 0 elsewhere, find (i) P(X > 1/2),
(ii) P(Y < X), (iii) P(Y < 1/2 / X < 1/2).
6. Two random variables X and Y have a given joint density. Find Cov(X, Y).
7. If the equations of the two lines of regression of Y on X and X on Y are respectively
7x - 16y + 9 = 0 and 5y - 4x - 3 = 0, calculate the coefficient of correlation.
WORKED OUT EXAMPLES
Example 1
The joint density of the random variables X and Y is given by
f(x, y) = 8xy,  0 < x < 1, 0 < y < x
        = 0,    otherwise
Find (i) f_X(x), (ii) f_Y(y), (iii) f(y/x).
Solution
We know that:
(i) The marginal pdf of X is
f_X(x) = f(x) = ∫_0^x 8xy dy = 4x³;   f(x) = 4x³, 0 < x < 1
(ii) The marginal pdf of Y is
f_Y(y) = f(y) = ∫_y^1 8xy dx = 4y(1 - y²);   f(y) = 4y(1 - y²), 0 < y < 1
(iii) We know that
f(y / x) = f(x, y)/f(x) = 8xy/4x³ = 2y/x²,   0 < y < x, 0 < x < 1
Example 2
Let X be a random variable with p.d.f. f(x) = 1/2, -1 ≤ x ≤ 1, and let Y = X². Find the
correlation coefficient between X and Y.
Solution
E(X) = ∫_{-1}^{1} x (1/2) dx = 0
E(X) = 0
E(Y) = E(X²) = ∫_{-1}^{1} x² (1/2) dx = (1/2)[x³/3]_{-1}^{1} = 1/3
E(XY) = E(X · X²) = E(X³) = ∫_{-1}^{1} x³ (1/2) dx = (1/2)[x⁴/4]_{-1}^{1} = 0
E(XY) = 0
Cov(X, Y) = E(XY) - E(X)E(Y) = 0
r(X, Y) = ρ(X, Y) = Cov(X, Y)/(σX σY) = 0
Note : Since E(X) and E(XY) are both zero, Cov(X, Y) = 0 and r = 0; there is no need to
compute σX and σY.
Result (for Example 1)
Marginal pdf of X : 4x³, 0 < x < 1
Marginal pdf of Y : 4y(1 - y²), 0 < y < 1
f(y/x)            : 2y/x², 0 < y < x, 0 < x < 1

UNIT - III
RANDOM PROCESSES
Introduction

In Chapter 1 we discussed random variables. A random variable is a function of the
possible outcomes of an experiment, but it does not include the concept of time. In real
situations we come across many time-varying functions which are random in nature. In
electrical and electronics engineering we study signals.
Generally, signals are classified into two types:
(i) Deterministic
(ii) Random
Both deterministic and random signals are functions of time. For a deterministic signal it
is possible to determine its value at any given time; this is not possible for a random
signal, since some element of uncertainty is always associated with it. The probability
model used for characterizing a random signal is called a random process or stochastic
process.
3.1 RANDOM PROCESS CONCEPT
A random process is a collection (ensemble) of random variables {X(s, t)} that are
functions of a real variable t, where s ∈ S (S is the sample space) and t ∈ T (T is an
index set).
REMARK
i) If t is fixed, then {X(s, t)} is a random variable.
ii) If s and t are fixed, {X(s, t)} is a number.
iii) If s is fixed, {X(s, t)} is a single time function (a sample function).
NOTATION
Hereafter we denote the random process {X(s, t)} by {X(t)}, where the index set T is
assumed to be continuous; a discrete-parameter process is denoted by {X(n)} or {Xn}.
A comparison between a random variable and a random process:

Random Variable                              Random Process
A function of the possible outcomes of       A function of the possible outcomes of
an experiment, X(s)                          an experiment and also of time, i.e. X(s, t)
An outcome is mapped into a number x.        Outcomes are mapped into waveforms,
                                             which are functions of time t.

3.2 CLASSIFICATION OF RANDOM PROCESSES

We can classify a random process according to the characteristics of the time t and of
the random variable X = X(t); both may be discrete or continuous, with -∞ < t < ∞ and
-∞ < x < ∞.
3.2.1 CONTINUOUS RANDOM PROCESS
If S is continuous and t takes any value, then X(t) is a continuous random process.
Example
Let X(t) = maximum temperature of a particular place in (0, t). Here S is a continuous set
and t ≥ 0 (takes all values), so {X(t)} is a continuous random process.
3.2.2 DISCRETE RANDOM PROCESS
If S assumes only discrete values and t is continuous, then we call such a random process
{X(t)} a discrete random process.
Example
Let X(t) be the number of telephone calls received in the interval (0, t).
Here S = {0, 1, 2, 3, ...} and T = {t, t ≥ 0}, so {X(t)} is a discrete random process.
3.2.3 CONTINUOUS RANDOM SEQUENCE
If S is continuous but the time t takes only discrete values, the process is called a
continuous random sequence.
3.2.4 DISCRETE RANDOM SEQUENCE
If both S and T are discrete, the process is called a discrete random sequence.
Example: Let Xn denote the outcome of the n-th toss of a fair die.
Here S = {1, 2, 3, 4, 5, 6} and T = {1, 2, 3, ...},
so {Xn, n = 1, 2, 3, ...} is a discrete random sequence.

3.3 CLASSIFICATION OF RANDOM PROCESSES BASED ON SAMPLE FUNCTIONS

Non-deterministic process
A process is called non-deterministic if the future values of any sample function cannot
be predicted exactly from observed values.
Deterministic process
A process is called deterministic if the future values of any sample function can be
predicted from past values.
3.3.1 STATIONARY PROCESS
A random process is said to be stationary if its mean, variance, moments etc. are constant.
Other processes are called non-stationary.
1. First-order distribution function of {X(t)}
For a specific t, X(t) is a random variable, as observed earlier.
F(x, t) = P{X(t) ≤ x} is called the first-order distribution of the process {X(t)}.
First-order density function of {X(t)}
f(x, t) = ∂F(x, t)/∂x is called the first-order density of {X(t)}.
2. Second-order distribution function of {X(t)}
F(x1, x2; t1, t2) = P{X(t1) ≤ x1; X(t2) ≤ x2} is the joint distribution of the random
variables X(t1) and X(t2) and is called the second-order distribution of the process {X(t)}.
Second-order density function of {X(t)}
f(x1, x2; t1, t2) = ∂²F(x1, x2; t1, t2) / (∂x1 ∂x2)
is called the second-order density of {X(t)}.

3.3.2 First-Order Stationary Process

Definition
A random process is called stationary to order one, or first-order stationary, if its first-order
density function does not change with a shift in the time origin.
In other words,
f_X(x1, t1) = f_X(x1, t1 + C) must be true for any t1 and any real number C if
{X(t)} is to be a first-order stationary process.
Example 3.3.1
Show that a first-order stationary process has a constant mean.
Solution
Let us consider the random process {X(t)} at two different times t1 and t2.
E[X(t1)] = ∫ x f(x, t1) dx
[f(x, t1) is the density of the random variable X(t1)]
E[X(t2)] = ∫ x f(x, t2) dx
[f(x, t2) is the density of the random variable X(t2)]
Let t2 = t1 + C. Then
E[X(t2)] = ∫ x f(x, t1 + C) dx = ∫ x f(x, t1) dx      [by first-order stationarity]
         = E[X(t1)]
Thus E[X(t2)] = E[X(t1)], i.e.
the mean of the process at t1 equals the mean of the process at t2.

Definition 2:
If the process is first-order stationary, then
Mean = E[X(t)] = constant
3.3.4 Second-Order Stationary Process
A random process is said to be second-order stationary if its second-order density
function is invariant under time shifts:
f(x1, x2; t1, t2) = f(x1, x2; t1 + C, t2 + C) for all x1, x2 and C.
E(X1²), E(X2²) and E(X1 X2) do not change with time, where X1 = X(t1), X2 = X(t2).
3.3.5 Strongly Stationary Process
A random process is called a strongly stationary process or strict-sense stationary
process (SSS process) if all its finite-dimensional distributions are invariant under
translation of time t:
f_X(x1, x2; t1, t2) = f_X(x1, x2; t1 + C, t2 + C)
f_X(x1, x2, x3; t1, t2, t3) = f_X(x1, x2, x3; t1 + C, t2 + C, t3 + C)
In general,
f_X(x1, ..., xn; t1, ..., tn) = f_X(x1, ..., xn; t1 + C, ..., tn + C)
for any t1, ..., tn and any real number C.
3.3.6 Jointly Stationary in the Strict Sense
{X(t)} and {Y(t)} are said to be jointly stationary in the strict sense if the joint
distributions of X(t) and Y(t) are invariant under translation of time.
Definition Mean:
μ_X(t) = E[X(t)],   -∞ < t < ∞
μ_X(t) is also called the mean function or ensemble average of the random process.

3.3.7 Autocorrelation of a Random Process

Let X(t1) and X(t2) be two members of the random process {X(t)}. The autocorrelation is
R_XX(t1, t2) = E[X(t1) X(t2)]      (1)
Mean square value
Putting t1 = t2 = t in (1), we get
R_XX(t, t) = E[X(t) X(t)] = E[X²(t)],
which is the mean square value of the random process.
3.3.8 Autocovariance of a Random Process
C_XX(t1, t2) = E{[X(t1) - E(X(t1))] [X(t2) - E(X(t2))]}
             = R_XX(t1, t2) - E[X(t1)] E[X(t2)]
Correlation coefficient
The correlation coefficient of the random process {X(t)} is defined as
ρ_XX(t1, t2) = C_XX(t1, t2) / √(Var X(t1) × Var X(t2)),
where C_XX(t1, t2) denotes the autocovariance.


3.4 CROSS CORRELATION
The cross correlation of the two random processes {X(t)} and {Y(t)} is defined by
R_XY(t1, t2) = E[X(t1) Y(t2)]
3.5 WIDE-SENSE STATIONARY (WSS)
A random process {X(t)} is called a weakly stationary process, covariance stationary
process or wide-sense stationary process if
i) E{X(t)} = constant
ii) E[X(t) X(t + τ)] = R_XX(τ) depends only on τ, where τ = t2 - t1.
REMARKS :
An SSS process of order two is a WSS process, but not conversely.
3.6 EVOLUTIONARY PROCESS
A random process that is not stationary in any sense is called an evolutionary process.
SOLVED PROBLEMS ON WIDE-SENSE STATIONARY PROCESSES
Example 3.6.1
Give an example of a stationary random process and justify your claim.
Solution:
Let us consider a random process X(t) = A cos(ωt + θ), where A and ω are constants and
θ is a random variable uniformly distributed in the interval (0, 2π).
Since θ is uniformly distributed in (0, 2π), we have
f(θ) = 1/2π,  0 < θ < 2π
     = 0,     otherwise
E[X(t)] = ∫ X(t) f(θ) dθ
= ∫_0^{2π} A cos(ωt + θ) (1/2π) dθ
= (A/2π) [sin(ωt + θ)]_0^{2π}
= (A/2π) [sin(2π + ωt) - sin(ωt)]
= (A/2π) [sin ωt - sin ωt]
= 0, a constant
Since E[X(t)] = a constant, the process X(t) is a stationary random process.
Example 3.6.2 (a process which is not stationary)
Examine whether the Poisson process {X(t)}, given by the probability law
P{X(t) = n} = e^{-λt} (λt)^n / n!,  n = 0, 1, 2, ...,
is stationary.
Solution
We know that the mean is given by
E[X(t)] = Σ_{n=0}^{∞} n P_n(t)
= Σ_{n=0}^{∞} n e^{-λt} (λt)^n / n!
= e^{-λt} Σ_{n=1}^{∞} (λt)^n / (n-1)!
= λt e^{-λt} Σ_{n=1}^{∞} (λt)^{n-1} / (n-1)!
= λt e^{-λt} [1 + λt + (λt)²/2! + ...]
= λt e^{-λt} e^{λt}
= λt, which depends on t.
Hence the Poisson process is not a stationary process.
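The time dependence of the mean E[X(t)] = λt can be seen in a small simulation (a rough Python sketch; the rate lam = 2 and time t = 3 are illustrative values, not from the text):

```python
import random

lam, t, runs = 2.0, 3.0, 100_000
random.seed(5)

def N(t):
    # count of exponential(lam) inter-arrival times fitting in (0, t]
    total, n = 0.0, 0
    while True:
        total += random.expovariate(lam)
        if total > t:
            return n
        n += 1

counts = [N(t) for _ in range(runs)]
print(sum(counts) / runs, lam * t)   # sample mean of X(t) vs lam*t = 6; the mean
                                     # grows with t, so it is not constant and the
                                     # process cannot be stationary
```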
3.7 ERGODIC RANDOM PROCESS
Time average
The time average of a random process {X(t)} is defined as
X̄_T = (1/2T) ∫_{-T}^{T} X(t) dt
Ensemble average
The ensemble average of a random process {X(t)} is the expected value of the random
variable X at time t:
Ensemble average = E[X(t)]
Mean-ergodic random process
{X(t)} is said to be mean-ergodic if
lim_{T→∞} X̄_T = μ,  i.e.  lim_{T→∞} (1/2T) ∫_{-T}^{T} X(t) dt = μ
Mean-ergodic theorem

Let {X(t)} be a random process with constant mean μ and let X̄_T be its time average.
Then {X(t)} is mean-ergodic if
lim_{T→∞} Var(X̄_T) = 0
Correlation-ergodic process

The stationary process {X(t)} is said to be correlation-ergodic if the process {Y(t)} is
mean-ergodic, where
Y(t) = X(t) X(t + τ),
i.e. E[Y(t)] = lim_{T→∞} Ȳ_T, where Ȳ_T is the time average of Y(t).

3.8 MARKOV PROCESS

Definition
A random process {X(t)} is said to be Markovian if
P[X(t_{n+1}) ≤ x_{n+1} / X(t_n) = x_n, X(t_{n-1}) = x_{n-1}, ..., X(t_0) = x_0]
  = P[X(t_{n+1}) ≤ x_{n+1} / X(t_n) = x_n],
where t_0 ≤ t_1 ≤ t_2 ≤ ... ≤ t_n ≤ t_{n+1}.
Examples of Markov processes
1. The probability of rain today depends only on the weather conditions that existed over
the last two days and not on past weather conditions.
2. A difference equation is Markovian.
Classification of Markov processes
A Markov process may have a continuous or a discrete parameter (time) set; when the
state space is discrete, the process is called a Markov chain. Thus we have four cases:
Continuous-parameter Markov process
Discrete-parameter Markov process
Continuous-parameter Markov chain
Discrete-parameter Markov chain
3.9 MARKOV CHAIN

Definition
We define the Markov chain as follows:
If P{Xn = an / X_{n-1} = a_{n-1}, X_{n-2} = a_{n-2}, ..., X_0 = a_0}
 = P{Xn = an / X_{n-1} = a_{n-1}} for all n, then the process {Xn}, n = 0, 1, 2, ..., is
called a Markov chain.
1. a1, a2, a3, ..., an, ... are called the states of the Markov chain.
2. The conditional probability P{Xn = aj / X_{n-1} = ai} = p_ij(n-1, n) is called the
one-step transition probability from state ai to state aj at the n-th step.
3. The t.p.m. of a Markov chain is a stochastic matrix:
i) p_ij ≥ 0
ii) Σ_j p_ij = 1   [the sum of the elements of any row is 1]
3.10 POISSON PROCESS
The Poisson process is a continuous-parameter, discrete-state process which is a very
useful model for many practical situations. It describes the number of times a specified
event has occurred when an experiment is conducted as a function of time.
Probability law for the Poisson process
Let λ be the rate of occurrence (number of occurrences per unit time) and let P_n(t) be
the probability of n occurrences of the event in the interval (0, t). Then the count follows
a Poisson distribution with parameter λt, i.e.
P[X(t) = n] = P_n(t) = e^{-λt} (λt)^n / n!,   n = 0, 1, 2, ...

Second-order probability function of a homogeneous Poisson process

P[X(t1) = n1, X(t2) = n2] = P[X(t1) = n1] · P[X(t2) = n2 / X(t1) = n1],   t2 > t1
= P[X(t1) = n1] · P[the event occurs n2 - n1 times in the interval (t2 - t1)]
= [e^{-λt1} (λt1)^{n1} / n1!] · [e^{-λ(t2 - t1)} {λ(t2 - t1)}^{n2 - n1} / (n2 - n1)!],  n2 ≥ n1
= e^{-λt2} λ^{n2} t1^{n1} (t2 - t1)^{n2 - n1} / [n1! (n2 - n1)!],   n2 ≥ n1
= 0, otherwise

3.11 SEMI-RANDOM TELEGRAPH SIGNAL PROCESS

If N(t) represents the number of occurrences of a specified event in (0, t) and
X(t) = (-1)^{N(t)}, then {X(t)} is called a semi-random telegraph signal process.
3.11.1 RANDOM TELEGRAPH SIGNAL PROCESS
Definition
A random telegraph process is a discrete random process X(t) satisfying the following:
i. X(t) assumes only one of the two possible values 1 or -1 at any time t
ii. X(0) = 1 or -1 with equal probability 1/2
iii. The number of transitions N(t) from one value to the other occurring in any interval of
length t is a Poisson process with rate λ, so that the probability of exactly r transitions is
P[N(t) = r] = e^{-λt} (λt)^r / r!,  r = 0, 1, 2, ...

[Figure: a typical sample function of the telegraph process, alternating between the
levels +1 and -1.]
Note: The process is an example of a discrete random process.
* Mean and auto correlation: P{X(t) = 1} = P{X(t) = -1} = 1/2 for any t.
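A short simulation sketch (added for illustration; the rate, horizon and grid are arbitrary) shows how such a sample path can be generated from Poisson event times:

```python
import numpy as np

rng = np.random.default_rng(2)
lam, horizon = 1.5, 10.0

# Event (transition) times of a Poisson process of rate λ on (0, horizon).
gaps = rng.exponential(1.0 / lam, size=100)
times = np.cumsum(gaps)
times = times[times < horizon]

x0 = rng.choice([1, -1])                 # X(0) = ±1 with probability 1/2 each
t_grid = np.linspace(0.0, horizon, 1000)
n_t = np.searchsorted(times, t_grid)     # N(t): number of transitions up to time t
x_t = x0 * (-1) ** n_t                   # X(t) flips sign at each transition
print(x_t[:10])
```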

TUTORIAL QUESTIONS
1. The t.p.m of a Markov chain with three states 0, 1, 2 is P =
and the initial state distribution is
Find (i) P[X2 = 3] (ii) P[X3 = 2, X2 = 3, X1 = 3, X0 = 2]
2. Three boys A, B, C are throwing a ball to each other. A always throws the ball to B and B always
throws the ball to C, but C is just as likely to throw the ball to B as to A. Show that the process is
Markovian. Find the transition matrix and classify the states.
3. A housewife buys 3 kinds of cereals A, B, C. She never buys the same cereal in successive
weeks. If she buys cereal A, the next week she buys cereal B. However, if she buys B or C, the
next week she is 3 times as likely to buy A as the other cereal. How often does she buy each of the
cereals?
4. A man either drives a car or catches a train to go to office each day. He never goes 2 days in a
row by train but if he drives one day, then the next day he is just as likely to drive again as he is
to travel by train. Now suppose that on the first day of the week, the man tossed a fair die and drove
to work if a 6 appeared. Find (1) the probability that he takes a train on the 3rd day, (2) the
probability that he drives to work in the long run.
WORKED OUT EXAMPLES

Example 1. Let X_n denote the outcome of the nth toss of a fair die.
Here S = {1, 2, 3, 4, 5, 6}
T = {1, 2, 3, ...}
{X_n, n = 1, 2, 3, ...} is a discrete random sequence.
Example 2. Give an example of a stationary random process and justify your claim.
Solution:
Let us consider a random process X(t) = A cos(ωt + θ), where A and ω are constants and
θ is a random variable uniformly distributed in the interval (0, 2π).
Since θ is uniformly distributed in (0, 2π), we have
f(θ) = 1/(2π), 0 < θ < 2π
     = 0, otherwise
E[X(t)] = ∫ X(t) f(θ) dθ
        = ∫_0^{2π} A cos(ωt + θ) (1/2π) dθ
        = (A/2π) [sin(ωt + θ)]_0^{2π}
        = (A/2π) [sin(2π + ωt) - sin(ωt)]
        = (A/2π) [sin ωt - sin ωt]
        = 0, a constant
Since E[X(t)] is a constant, the process {X(t)} is a stationary random process.
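A quick numerical cross-check (an added sketch; A, ω and t are arbitrary) estimates E[X(t)] by averaging over many draws of θ:

```python
import numpy as np

rng = np.random.default_rng(3)
A, w, t = 2.0, 5.0, 1.7                      # arbitrary constants
theta = rng.uniform(0.0, 2.0 * np.pi, 1_000_000)
print(np.mean(A * np.cos(w * t + theta)))    # ≈ 0 for any t, matching E[X(t)] = 0
```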
Example 3. Examine whether the Poisson process {X(t)}, given by the probability law
P[X(t) = n] = e^{-λt} (λt)^n / n!,  n = 0, 1, 2, ...,
is stationary.
Solution
We know that the mean is given by
E[X(t)] = Σ_{n=0}^{∞} n P_n(t)
        = Σ_{n=0}^{∞} n e^{-λt} (λt)^n / n!
        = e^{-λt} Σ_{n=1}^{∞} (λt)^n / (n-1)!
        = (λt) e^{-λt} Σ_{n=1}^{∞} (λt)^{n-1} / (n-1)!
        = (λt) e^{-λt} [1 + λt/1! + (λt)²/2! + ...]
        = (λt) e^{-λt} e^{λt}
        = λt, which depends on t.
Hence the Poisson process is not a stationary process.

UNIT - 4
CORRELATION AND SPECTRAL DENSITY
Introduction

The power spectrum of a time series x(t) describes how the variance of the data x(t) is
distributed over the frequency components into which x(t) may be decomposed. This
distribution of the variance may be described by the cumulative spectral distribution
function S(f), defined as the power contributed by frequencies from 0 up to f. Given a
band of frequencies [a, b), the amount of variance contributed to x(t) by frequencies
lying within the interval [a, b) is given by S(b) - S(a). Then S is called the spectral
distribution function of x.
The spectral density at a frequency f gives the rate of variance contributed by
frequencies in the immediate neighbourhood of f to the variance of x, per unit frequency.
4.1 Auto Correlation of a Random Process
Let X(t_1) and X(t_2) be the two given random variables. Then the auto correlation is
R_XX(t_1, t_2) = E[X(t_1) X(t_2)]   ... (1)
Mean Square Value
Putting t_1 = t_2 = t in (1),
R_XX(t, t) = E[X(t) X(t)]
i.e., R_XX(t, t) = E[X²(t)],
which is called the mean square value of the random process.
Auto Correlation Function
Definition: The auto correlation function of the random process {X(t)} is
R_XX(τ) = E{X(t) X(t+τ)}
Note: R_XX(τ) = R(τ) = R_X(τ)
PROPERTY: 1
The mean square value of the random process may be obtained from the auto correlation
function R_XX(τ) by putting τ = 0:
R_XX(0) = E[X²(t)], which is known as the average power of the random process {X(t)}.
PROPERTY: 2
R_XX(τ) is an even function of τ:
R_XX(τ) = R_XX(-τ)
PROPERTY: 3
If the process X(t) contains a periodic component, then R_XX(τ) also contains a periodic
component of the same period.
PROPERTY: 4
If a random process {X(t)} has no periodic components, and E[X(t)] = X̄, then
lim_{|τ|→∞} R_XX(τ) = X̄²   (or)   X̄ = √( lim_{|τ|→∞} R_XX(τ) )
i.e., when |τ| → ∞, the auto correlation function represents the square of the mean of the
random process.
PROPERTY: 5
The auto correlation function of a random process cannot have an arbitrary shape; its
maximum occurs at τ = 0, i.e., |R_XX(τ)| ≤ R_XX(0).
SOLVED PROBLEMS
Example: 1
Check whether the following functions are valid auto correlation functions:
(i) R_XX(τ) = 5 sin nπτ   (ii) R_XX(τ) = 1/(1 + 9τ²)
Solution:
(i) Given R_XX(τ) = 5 sin nπτ.
R_XX(-τ) = 5 sin(-nπτ) = -5 sin nπτ
Since R_XX(-τ) ≠ R_XX(τ), the given function is not an auto correlation function.
(ii) Given R_XX(τ) = 1/(1 + 9τ²).
R_XX(-τ) = 1/(1 + 9(-τ)²) = R_XX(τ)
The given function is an auto correlation function.

Example : 2
Find the mean and variance of a stationary random process whose auto correlation
function is given by
R_XX(τ) = 18 + 2/(6 + τ²)
Solution
Given R_XX(τ) = 18 + 2/(6 + τ²).
X̄² = lim_{|τ|→∞} R_XX(τ)
   = lim_{|τ|→∞} [18 + 2/(6 + τ²)]
   = 18 + 0 = 18
E[X(t)] = X̄ = √18
We know that
E[X²(t)] = R_XX(0) = 18 + 2/6 = 55/3
Var{X(t)} = E[X²(t)] - {E[X(t)]}²
          = 55/3 - 18 = 1/3
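These limits are easy to sanity-check numerically; the sketch below (added for illustration) evaluates the given R_XX(τ) at τ = 0 and at a large |τ|:

```python
def r_xx(tau):
    """Auto correlation function R_XX(τ) = 18 + 2 / (6 + τ²)."""
    return 18.0 + 2.0 / (6.0 + tau ** 2)

mean_sq_of_mean = r_xx(1e6)        # ≈ lim R_XX(τ) as |τ| → ∞, i.e. (E[X])² = 18
mean_square = r_xx(0.0)            # E[X²(t)] = R_XX(0) = 55/3
variance = mean_square - mean_sq_of_mean
print(mean_sq_of_mean, mean_square, variance)   # ≈ 18, 18.333..., 0.333...
```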
Example : 3
Express the auto correlation function of the process {X'(t)} in terms of the auto correlation
function of the process {X(t)}.
Solution
Consider R_XX'(t_1, t_2) = E{X(t_1) X'(t_2)}
   = E{ X(t_1) lim_{h→0} [X(t_2 + h) - X(t_2)] / h }
   = lim_{h→0} E{ [X(t_1) X(t_2 + h) - X(t_1) X(t_2)] / h }
   = lim_{h→0} [R_XX(t_1, t_2 + h) - R_XX(t_1, t_2)] / h
R_XX'(t_1, t_2) = ∂R_XX(t_1, t_2) / ∂t_2   ... (1)
Similarly,
R_X'X'(t_1, t_2) = ∂R_XX'(t_1, t_2) / ∂t_1 = ∂²R_XX(t_1, t_2) / (∂t_1 ∂t_2),   by (1)
Auto Covariance
The auto covariance of the process {X(t)}, denoted by C_XX(t_1, t_2) or C(t_1, t_2), is defined as
C_XX(t_1, t_2) = E{ [X(t_1) - E(X(t_1))] [X(t_2) - E(X(t_2))] }
4.2 CORRELATION COEFFICIENT

ρ_XX(t_1, t_2) = C_XX(t_1, t_2) / √( Var X(t_1) × Var X(t_2) )
where C_XX(t_1, t_2) denotes the auto covariance.


4.3 CROSS CORRELATION
Cross correlation between the two random process {X(t)} and {Y(t)} is defined as
R_XY(t_1, t_2) = E[X(t_1) Y(t_2)], where X(t_1) and Y(t_2) are random variables.
4.4 CROSS COVARIANCE
Let {X(t)} and {Y(t)} be any two random processes. Then the cross covariance is defined
as
C_XY(t_1, t_2) = E{ [X(t_1) - E(X(t_1))] [Y(t_2) - E(Y(t_2))] }
The relation between the cross correlation and the cross covariance is as follows:
C_XY(t_1, t_2) = R_XY(t_1, t_2) - E[X(t_1)] E[Y(t_2)]
Definition
Two random processes {X(t)} and {Y(t)} are said to be uncorrelated if
C_XY(t_1, t_2) = 0 for all t_1, t_2.
Hence, from the above relation, for uncorrelated processes we have
R_XY(t_1, t_2) = E[X(t_1)] E[Y(t_2)]
4.4.1 CROSS CORRELATION COEFFICIENT
ρ_XY(t_1, t_2) = C_XY(t_1, t_2) / √( Var(X(t_1)) Var(Y(t_2)) )
4.4.2 CROSS CORRELATION AND ITS PROPERTIES
Let {X(t)} and {Y(t)} be two random processes. Then the cross correlation between them
is also defined as
R_XY(t, t+τ) = E[X(t) Y(t+τ)] = R_XY(τ)
PROPERTY : 1
R_XY(τ) = R_YX(-τ)
PROPERTY : 2
If {X(t)} and {Y(t)} are two random processes, then |R_XY(τ)| ≤ √( R_XX(0) R_YY(0) ),
where R_XX(τ) and R_YY(τ) are their respective auto correlation functions.
PROPERTY : 3
If {X(t)} and {Y(t)} are two random processes, then
|R_XY(τ)| ≤ [R_XX(0) + R_YY(0)] / 2

SOLVED PROBLEMS ON CROSS CORRELATION

Example: 4.4.1
Two random processes {X(t)} and {Y(t)} are given by X(t) = A cos(ωt + θ), Y(t) = A sin(ωt + θ),
where A and ω are constants and θ is a uniform random variable over 0 to 2π. Find the
cross correlation function.
Solution
By definition, we have
R_XY(τ) = R_XY(t, t+τ)   ... (1)
Now, R_XY(t, t+τ) = E[X(t) Y(t+τ)]
   = E[A cos(ωt + θ) · A sin(ω(t+τ) + θ)]
   = A² E[sin(ω(t+τ) + θ) cos(ωt + θ)]   ... (2)
Since θ is a uniformly distributed random variable, we have
f(θ) = 1/(2π), 0 ≤ θ ≤ 2π
Now E[sin(ω(t+τ) + θ) cos(ωt + θ)]
   = ∫ sin(ωt + ωτ + θ) cos(ωt + θ) f(θ) dθ
   = (1/2π) ∫_0^{2π} sin(ωt + ωτ + θ) cos(ωt + θ) dθ
   = (1/2π) ∫_0^{2π} (1/2) {sin(2ωt + ωτ + 2θ) + sin ωτ} dθ
   = (1/4π) [ -cos(2ωt + ωτ + 2θ)/2 + θ sin ωτ ]_0^{2π}
   = (1/4π) [ (cos(2ωt + ωτ) - cos(2ωt + ωτ + 4π))/2 + 2π sin ωτ ]
   = (1/4π) [0 + 2π sin ωτ]
   = (1/2) sin ωτ   ... (3)
Substituting (3) in (2), and then in (1), we get
R_XY(t, t+τ) = (A²/2) sin ωτ
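As an added cross-check (illustrative only; A, ω, t and τ are arbitrary), a Monte Carlo average over θ reproduces R_XY(τ) = (A²/2) sin ωτ:

```python
import numpy as np

rng = np.random.default_rng(4)
A, w, t, tau = 2.0, 3.0, 0.4, 0.9
theta = rng.uniform(0.0, 2.0 * np.pi, 1_000_000)

x = A * np.cos(w * t + theta)               # X(t)
y = A * np.sin(w * (t + tau) + theta)       # Y(t + τ)
print(np.mean(x * y))                       # Monte Carlo estimate of R_XY(τ)
print(A ** 2 / 2 * np.sin(w * tau))         # closed form (A²/2) sin ωτ
```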
4.5 SPECTRAL DENSITIES (POWER SPECTRAL DENSITY)
INTRODUCTION
(i) Fourier Transformation
(ii) Inverse Fourier Transform
(iii) Properties of Auto Correlation Function
(iv)Basic Trigonometric Formula
(v) Basic Integration
4.5.1 SPECTRAL REPRESENTATION
Let x(t) be a deterministic signal. The Fourier transform of x(t) is defined as
F[x(t)] = X(ω) = ∫_{-∞}^{∞} x(t) e^{-iωt} dt
Here X(ω) is called the "spectrum of x(t)".
Hence x(t) = inverse Fourier transform of X(ω)
           = (1/2π) ∫_{-∞}^{∞} X(ω) e^{iωt} dω
Definition
The average power P(T) of x(t) over the interval (-T, T) is given by
P(T) = (1/2T) ∫_{-T}^{T} x²(t) dt

     = (1/2π) ∫_{-∞}^{∞} |X_T(ω)|² / (2T) dω   ... (1)
Definition
The average power P_XX for the random process {X(t)} is given by
P_XX = lim_{T→∞} (1/2T) ∫_{-T}^{T} E[X²(t)] dt
     = (1/2π) ∫_{-∞}^{∞} lim_{T→∞} E[|X_T(ω)|²] / (2T) dω   ... (2)

4.6 POWER SPECTRAL DENSITY FUNCTION


Definition
If {X(t)} is a stationary process (either in the strict sense or wide sense) with auto
correlation function RXX(), then the Fourier transform of RXX() is called the power spectral
density function of {X(t)} and is denoted by SXX() or S() or SX().
S_XX(ω) = Fourier transform of R_XX(τ)
        = ∫_{-∞}^{∞} R_XX(τ) e^{-iωτ} dτ
Thus,
S_XX(f) = ∫_{-∞}^{∞} R_XX(τ) e^{-i2πfτ} dτ

4.6.1 WIENER-KHINCHINE RELATION
S_XX(ω) = ∫_{-∞}^{∞} R_XX(τ) e^{-iωτ} dτ
S_XX(f) = ∫_{-∞}^{∞} R_XX(τ) e^{-i2πfτ} dτ
To find R_XX(τ) if S_XX(ω) or S_XX(f) is given:
R_XX(τ) = (1/2π) ∫_{-∞}^{∞} S_XX(ω) e^{iωτ} dω   [inverse Fourier transform of S_XX(ω)]
(or) R_XX(τ) = ∫_{-∞}^{∞} S_XX(f) e^{i2πfτ} df   [inverse Fourier transform of S_XX(f)]
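To see the transform pair numerically (a sketch added to these notes; the ACF e^{-2λ|τ|} anticipates the telegraph-signal example of Unit 5), one can integrate R_XX(τ) e^{-iωτ} on a grid and compare with the known closed form 4λ/(4λ² + ω²):

```python
import numpy as np

lam = 1.0
tau = np.linspace(-40.0, 40.0, 400_001)       # wide grid so the ACF has decayed
r = np.exp(-2.0 * lam * np.abs(tau))          # R_XX(τ) = e^{-2λ|τ|}

for w in (0.0, 1.0, 3.0):
    s_num = np.trapz(r * np.exp(-1j * w * tau), tau).real   # S_XX(ω) by quadrature
    s_exact = 4.0 * lam / (4.0 * lam ** 2 + w ** 2)
    print(w, s_num, s_exact)
```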


4.7 PROPERTIES OF POWER SPECTRAL DENSITY FUNCTION
Property 1:

The value of the spectral density function at zero frequency is equal to the total area
under the graph of the auto correlation function:
S_XX(f) = ∫_{-∞}^{∞} R_XX(τ) e^{-i2πfτ} dτ
Taking f = 0, we get
S_XX(0) = ∫_{-∞}^{∞} R_XX(τ) dτ

TUTORIAL QUESTIONS
1. Find the ACF of {Y(t)} = A X(t) cos(ω₀t + θ), where X(t) is a zero-mean stationary random
process with ACF R_XX(τ), A and ω₀ are constants, and θ is uniformly distributed over (0, 2π)
and independent of X(t).
2. Find the ACF of the periodic time function X(t) = A sin ωt.
3. If X(t) is a WSS process and if Y(t) = X(t + a) - X(t - a), prove that
R_YY(τ) = 2R_XX(τ) - R_XX(τ + 2a) - R_XX(τ - 2a).
4. If X(t) = A sin(ωt + θ), where A and ω are constants and θ is a random variable uniformly
distributed over (-π, π), find the A.C.F of {Y(t)} where Y(t) = X²(t).
5. Let X(t) and Y(t) be defined by X(t) = A cos ωt + B sin ωt and Y(t) = B cos ωt - A sin ωt,
where ω is a constant and A and B are independent random variables both having zero mean
and variance σ². Find the cross correlation of X(t) and Y(t). Are X(t) and Y(t) jointly W.S.S.
processes?
6. Two random processes X(t) and Y(t) are given by X(t) = A cos(ωt + θ), Y(t) = A sin(ωt + θ),
where A and ω are constants and θ is uniformly distributed over (0, 2π). Find the cross
correlation of X(t) and Y(t) and verify that R_XY(-τ) = R_YX(τ).
7. If U(t) = X cos ωt + Y sin ωt and V(t) = Y cos ωt + X sin ωt, where X and Y are independent
random variables such that E(X) = 0 = E(Y), E[X²] = E[Y²] = 1, show that U(t) and V(t) are not
jointly W.S.S. but they are individually stationary in the wide sense.
8. Random processes X(t) and Y(t) are defined by X(t) = A cos(ωt + θ), Y(t) = B cos(ωt + θ),
where A, B and ω are constants and θ is uniformly distributed over (0, 2π). Find the cross
correlation and show that X(t) and Y(t) are jointly W.S.S.
WORKED OUT EXAMPLES
Example 1. Check whether the following functions are valid auto correlation functions:
(i) R_XX(τ) = 5 sin nπτ   (ii) R_XX(τ) = 1/(1 + 9τ²)
Solution:
(i) Given R_XX(τ) = 5 sin nπτ.
R_XX(-τ) = 5 sin(-nπτ) = -5 sin nπτ
Since R_XX(-τ) ≠ R_XX(τ), the given function is not an auto correlation function.
(ii) Given R_XX(τ) = 1/(1 + 9τ²).
R_XX(-τ) = 1/(1 + 9(-τ)²) = R_XX(τ)
The given function is an auto correlation function.

Example : 2
Find the mean and variance of a stationary random process whose auto correlation
function is given by
R_XX(τ) = 18 + 2/(6 + τ²)
Solution
Given R_XX(τ) = 18 + 2/(6 + τ²).
X̄² = lim_{|τ|→∞} R_XX(τ) = lim_{|τ|→∞} [18 + 2/(6 + τ²)] = 18 + 0 = 18
E[X(t)] = X̄ = √18
We know that
E[X²(t)] = R_XX(0) = 18 + 2/6 = 55/3
Var{X(t)} = E[X²(t)] - {E[X(t)]}² = 55/3 - 18 = 1/3
Example : 3
Express the auto correlation function of the process {X'(t)} in terms of the auto correlation
function of the process {X(t)}.
Solution
Consider R_XX'(t_1, t_2) = E{X(t_1) X'(t_2)}
   = E{ X(t_1) lim_{h→0} [X(t_2 + h) - X(t_2)] / h }
   = lim_{h→0} E{ [X(t_1) X(t_2 + h) - X(t_1) X(t_2)] / h }
   = lim_{h→0} [R_XX(t_1, t_2 + h) - R_XX(t_1, t_2)] / h
R_XX'(t_1, t_2) = ∂R_XX(t_1, t_2) / ∂t_2   ... (1)
Similarly,
R_X'X'(t_1, t_2) = ∂R_XX'(t_1, t_2) / ∂t_1 = ∂²R_XX(t_1, t_2) / (∂t_1 ∂t_2),   by (1)
Example : 4
Two random processes {X(t)} and {Y(t)} are given by X(t) = A cos(ωt + θ), Y(t) = A sin(ωt + θ),
where A and ω are constants and θ is a uniform random variable over 0 to 2π. Find the cross
correlation function.
Solution
By definition, we have
R_XY(τ) = R_XY(t, t+τ)   ... (1)
Now, R_XY(t, t+τ) = E[X(t) Y(t+τ)]
   = E[A cos(ωt + θ) · A sin(ω(t+τ) + θ)]
   = A² E[sin(ω(t+τ) + θ) cos(ωt + θ)]   ... (2)
Since θ is a uniformly distributed random variable, we have
f(θ) = 1/(2π), 0 ≤ θ ≤ 2π
Now E[sin(ω(t+τ) + θ) cos(ωt + θ)]
   = ∫ sin(ωt + ωτ + θ) cos(ωt + θ) f(θ) dθ
   = (1/2π) ∫_0^{2π} sin(ωt + ωτ + θ) cos(ωt + θ) dθ
   = (1/2π) ∫_0^{2π} (1/2) {sin(2ωt + ωτ + 2θ) + sin ωτ} dθ
   = (1/4π) [ -cos(2ωt + ωτ + 2θ)/2 + θ sin ωτ ]_0^{2π}
   = (1/4π) [ (cos(2ωt + ωτ) - cos(2ωt + ωτ + 4π))/2 + 2π sin ωτ ]
   = (1/4π) [0 + 2π sin ωτ]
   = (1/2) sin ωτ   ... (3)
Substituting (3) in (2), and then in (1), we get
R_XY(t, t+τ) = (A²/2) sin ωτ

UNIT 5
LINEAR SYSTEM WITH RANDOM INPUTS
Introduction

Mathematically a "system" is a functional relationship between the input x(t) and


y(t). We can write the relationship as
y(f) = f[x(t): < + <]
Let x(t) represents a sample function of a random process {X(t)}. Suppose the
system produces an output or response y(f) and the ensemble of the output functions
forms a random process {Y(t)}. Then the process {Y(t)} can be considered as the output
of the system or transformation 'f' with {X(t)} as the input and the system is completely
specified by the operator "f".
5.1 LINEAR TIME INVARIANT SYSTEM
Mathematically a "system" is a functional relationship between the input x(t) and output
y(t). we can write the relationship
y (t ) = f x (t ) : < t <
5.2 CLASSIFICATION OF SYSTEM
1. Linear System: f is called a linear system if it satisfies
f[a_1 X_1(t) + a_2 X_2(t)] = a_1 f[X_1(t)] + a_2 f[X_2(t)]

2. Time Invariant System:
Let Y(t) = f[X(t)].
If Y(t + h) = f[X(t + h)], then f is called a time invariant system, or X(t) and Y(t) are said to
form a time invariant system.

3. Causal System:
Suppose the value of the output Y(t) at t = t_0 depends only on the past values of the input
X(t), t ≤ t_0.
In other words, if Y(t_0) = f[X(t) : t ≤ t_0], then such a system is called a causal system.
4. Memory less System:
If the output Y(t) at a given time t = t 0 depends only on X(t0) and not on any other past or
future values of X(t), then the system f is called memory less system.
5. Stable System:
A linear time invariant system is said to be stable if its response to any bounded input is
bounded.
REMARK:
i) Note that when we write X(t) we mean X(s, t), where s ∈ S and S is the sample space. If the
system operates only on the variable t, treating s as a parameter, it is called a deterministic
system.
[Figure: (a) a general single input-output linear system, with input X(t), output Y(t) and
weighting function h(t); (b) a linear time invariant (LTI) system, with input X(t), output
Y(t) and impulse response h(t).]
5.3 REPRESENTATION OF SYSTEM IN THE FORM OF CONVOLUTION

Y(t) = h(t) * X(t)
     = ∫_{-∞}^{∞} h(u) X(t - u) du
     = ∫_{-∞}^{∞} h(t - u) X(u) du
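Numerically, this convolution can be approximated on a grid (an added sketch; the exponential impulse response anticipates Example 5.4.2 and the input signal is arbitrary):

```python
import numpy as np

dt = 0.01
t = np.arange(0.0, 10.0, dt)
beta = 2.0
h = np.exp(-beta * t)                  # causal impulse response h(t) = e^{-βt} U(t)
x = np.sin(3.0 * t)                    # an arbitrary input signal

# Discrete approximation of Y(t) = ∫ h(u) x(t-u) du on the grid.
y = np.convolve(h, x)[: len(t)] * dt
print(y[:5])
```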

5.4 UNIT IMPULSE RESPONSE TO THE SYSTEM


If the input to the system is the unit impulse function δ(t), then the output or response is
Y(t) = h(t),
which is the system weighting function.
5.4.1 PROPERTIES OF LINEAR SYSTEMS WITH RANDOM INPUT
Property 1:
If the input X(t) and its output Y(t) are related by
Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du,
then the system is a linear time-invariant system.
Property 2:
If the input to a time - invariant, stable linear system is a WSS process, then the output
will also be a WSS process, i.e To show that if {X(t)} is a WSS process then the output {Y(t)} is
a WSS process.
Property 3:
If {X(t)} is a WSS process and if Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du, then
R_XY(τ) = R_XX(τ) * h(τ)   (where * denotes convolution)
Property 4:
If {X(t)} is a WSS process and if Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du, then
R_YY(τ) = R_XY(τ) * h(-τ)

Property 5:
If {X(t)} is a WSS process and if Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du, then
R_YY(τ) = R_XX(τ) * h(τ) * h(-τ)
Property 6:
If {X(t)} is a WSS process and if Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du, then
S_XY(ω) = S_XX(ω) H(ω)

Property 7:
If {X(t)} is a WSS process and if Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du, then
S_YY(ω) = S_XX(ω) |H(ω)|²

Note:
Instead of taking R_XY(τ) = E[X(t) Y(t+τ)] in properties (3), (4) & (5), if we start
with R_XY(τ) = E[X(t) Y(t-τ)], then the above properties can also be stated as
a) R_XY(τ) = R_XX(τ) * h(-τ)
b) R_YY(τ) = R_XY(τ) * h(τ)
c) R_YY(τ) = R_XX(τ) * h(τ) * h(-τ)
REMARK :
(i) We have written H(ω) H*(ω) = |H(ω)|² because
H(ω) = F[h(τ)] and H*(ω) = F[h(-τ)]
(ii) Equation (c) gives the relationship between the spectral densities of the input and output
processes in the system.
(iii) System transfer function:
We call H(ω) = F{h(τ)} the system transfer function; |H(ω)|² is the power transfer function.
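The input-output spectral relation S_YY(ω) = |H(ω)|² S_XX(ω) is easy to see numerically (an added sketch, assuming a simple moving-average filter and white-noise input; spectra are estimated with averaged periodograms):

```python
import numpy as np

rng = np.random.default_rng(5)
n, trials = 1024, 400
h = np.ones(8) / 8.0                       # FIR impulse response (moving average)
H = np.fft.rfft(h, n)                      # frequency response on the FFT grid

sxx = syy = 0.0
for _ in range(trials):
    x = rng.standard_normal(n)             # white WSS input, S_XX(ω) = 1
    y = np.convolve(x, h, mode="full")[:n]
    sxx += np.abs(np.fft.rfft(x)) ** 2 / n
    syy += np.abs(np.fft.rfft(y)) ** 2 / n

# Averaged periodograms: S_YY(ω) ≈ |H(ω)|² S_XX(ω)
err = np.max(np.abs(syy / trials - np.abs(H) ** 2 * sxx / trials))
print(err)   # small compared with the passband level of |H(ω)|² ≈ 1
```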
SOLVED PROBLEMS ON AUTO CROSS CORRELATION FUNCTIONS OF INPUT
AND OUTPUT
Example :5.4.1
Find the power spectral density of the random telegraph signal.
Solution
We know the auto correlation of the telegraph signal process X(t) is
R_XX(τ) = e^{-2λ|τ|}
The power spectral density is
S_XX(ω) = ∫_{-∞}^{∞} R_XX(τ) e^{-iωτ} dτ
        = ∫_{-∞}^{0} e^{2λτ} e^{-iωτ} dτ + ∫_{0}^{∞} e^{-2λτ} e^{-iωτ} dτ
          (since |τ| = -τ when τ < 0 and |τ| = τ when τ > 0)
        = ∫_{-∞}^{0} e^{(2λ-iω)τ} dτ + ∫_{0}^{∞} e^{-(2λ+iω)τ} dτ
        = [ e^{(2λ-iω)τ} / (2λ-iω) ]_{-∞}^{0} + [ e^{-(2λ+iω)τ} / (-(2λ+iω)) ]_{0}^{∞}
        = 1/(2λ-iω) (1 - 0) - 1/(2λ+iω) (0 - 1)
        = 1/(2λ-iω) + 1/(2λ+iω)
        = (2λ+iω + 2λ-iω) / ((2λ-iω)(2λ+iω))
S_XX(ω) = 4λ / (4λ² + ω²)
Example : 5.4.2
A linear time invariant system has an impulse response h(t) = e^{-βt} U(t). Find the
power spectral density of the output Y(t) corresponding to the input X(t).
Solution:
Given X(t) - input, Y(t) - output:
S_YY(ω) = |H(ω)|² S_XX(ω)
Now H(ω) = ∫_{-∞}^{∞} h(t) e^{-iωt} dt
         = ∫_{0}^{∞} e^{-βt} e^{-iωt} dt
         = ∫_{0}^{∞} e^{-(β+iω)t} dt
         = [ e^{-(β+iω)t} / (-(β+iω)) ]_{0}^{∞}
         = (0 - 1) / (-(β+iω))
         = 1/(β+iω)
|H(ω)|² = 1/(β² + ω²)
S_YY(ω) = S_XX(ω) / (β² + ω²)
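A numeric spot-check of this transform (an added sketch; β and ω are arbitrary) compares quadrature against the closed form 1/(β² + ω²):

```python
import numpy as np

beta, w = 2.0, 3.0
t = np.linspace(0.0, 50.0, 500_001)              # e^{-βt} has decayed by t = 50
h = np.exp(-beta * t)
H = np.trapz(h * np.exp(-1j * w * t), t)         # H(ω) = ∫_0^∞ e^{-βt} e^{-iωt} dt
print(abs(H) ** 2, 1.0 / (beta ** 2 + w ** 2))   # both ≈ 1/(β² + ω²)
```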
TUTORIAL QUESTIONS
1. State and prove the power spectral density of system response theorem.
2. Suppose that X(t) is the input to an LTI system with impulse response h1(t) and that Y(t) is the
input to another LTI system with impulse response h2(t). It is assumed that X(t) and Y(t) are
jointly wide sense stationary. Let V(t) and Z(t) denote the random processes at the respective
system outputs. Find the cross correlation of V(t) and Z(t).
3. The input to the RC filter is a white noise process with a given ACF. If the frequency
response is H(ω), find the auto correlation and the mean square value of the
output process Y(t).
4. A random process X(t) having a given ACF, where P and λ are real positive
constants, is applied to the input of the system with impulse response
h(t) = e^{-βt}, t > 0
     = 0,      t < 0
where β is a real positive constant. Find the ACF of the network's response Y(t). Find the
cross correlation.
WORKED OUT EXAMPLES
Example: 1
Find the power spectral density of the random telegraph signal.
Solution
We know the auto correlation of the telegraph signal process X(t) is
R_XX(τ) = e^{-2λ|τ|}
The power spectral density is
S_XX(ω) = ∫_{-∞}^{∞} R_XX(τ) e^{-iωτ} dτ
        = ∫_{-∞}^{0} e^{2λτ} e^{-iωτ} dτ + ∫_{0}^{∞} e^{-2λτ} e^{-iωτ} dτ
        = [ e^{(2λ-iω)τ} / (2λ-iω) ]_{-∞}^{0} + [ e^{-(2λ+iω)τ} / (-(2λ+iω)) ]_{0}^{∞}
        = 1/(2λ-iω) + 1/(2λ+iω)
        = (2λ+iω + 2λ-iω) / ((2λ-iω)(2λ+iω))
S_XX(ω) = 4λ / (4λ² + ω²)

Probability & Random Process


Formulas
UNIT-I (RANDOM VARIABLES)
1) Discrete random variable:
A random variable whose set of possible values is either finite or
countably infinite is called discrete random variable.
Eg: (i) Let X represent the sum of the numbers on the 2 dice, when
two dice are thrown. In this case the random variable X takes the values
2, 3, 4, 5, 6, 7, 8, 9, 10, 11 and 12. So X is a discrete random variable.
(ii) Number of transmitted bits received in error.
2) Continuous random variable:
A random variable X is said to be continuous if it takes all possible
values between certain limits.
Eg: The length of time during which a vacuum tube installed in a
circuit functions is a continuous random variable, number of scratches
on a surface, proportion of defective parts among 1000 tested, number
of transmitted in error.
3)

Sl.No. Discrete random variable

Continuous random variable

p ( xi ) 1

f ( x )dx 1

F ( x) P X x

F ( x) P X x f ( x )dx

Mean E X x

Mean E X xf ( x )dx

p ( xi )

i
i

E X

5
6

p ( xi )

xi
i

Var X E X
Moment = E X r

E X
2

M.G.F

f ( x )dx

E X
xr p

Var X E X
Moment = E
M.G.F

E X

r
X

f ( x )dx

MX

t=E

tX

tx

= e p ( x)
x

tX

MX t=E e

tx

=e

f ( x )dx

4) E[aX + b] = a E[X] + b
5) Var(aX + b) = a² Var(X)
6) Var(aX ± bY) = a² Var(X) + b² Var(Y)
7) Standard deviation = √Var(X)
8) f(x) = F'(x)
9) P(X > a) = 1 - P(X ≤ a)
10) P(A/B) = P(A ∩ B) / P(B), P(B) ≠ 0

11) If A and B are independent, then P(A ∩ B) = P(A) · P(B).


12) 1st moment about origin = E[X] = [ (d/dt) M_X(t) ]_{t=0}   (Mean)
2nd moment about origin = E[X²] = [ (d²/dt²) M_X(t) ]_{t=0}
The coefficient of t^r / r! in M_X(t) = E[X^r]   (rth moment about the origin)
13) Limitation of M.G.F:
i)
A random variable X may have no moments although its m.g.f exists.
ii)
A random variable X can have its m.g.f and some or all moments, yet the
m.g.f does not generate the moments.
iii)
A random variable X can have all or some moments, but m.g.f does not
exist except perhaps at one point.
14) Properties of M.G.F:
i) If Y = aX + b, then M_Y(t) = e^{bt} M_X(at).
ii) M_{cX}(t) = M_X(ct), where c is a constant.
iii) If X and Y are two independent random variables, then M_{X+Y}(t) = M_X(t) M_Y(t).
15) P.D.F, M.G.F, Mean and Variance of all the distributions:

Sl.No.  Distribution  P.D.F (P(X = x))                            M.G.F                        Mean      Variance
1       Binomial      nC_x p^x q^{n-x}                            (q + p e^t)^n                np        npq
2       Poisson       e^{-λ} λ^x / x!                             e^{λ(e^t - 1)}               λ         λ
3       Geometric     q^{x-1} p (or) q^x p                        p e^t / (1 - q e^t)          1/p       q/p²
4       Uniform       f(x) = 1/(b-a), a < x < b; 0, otherwise     (e^{bt} - e^{at}) / ((b-a)t) (a+b)/2   (b-a)²/12
5       Exponential   f(x) = λ e^{-λx}, x ≥ 0, λ > 0;             λ / (λ - t)                  1/λ       1/λ²
                      0, otherwise
6       Gamma         f(x) = e^{-x} x^{λ-1} / Γ(λ),               (1 - t)^{-λ}                 λ         λ
                      0 ≤ x < ∞, λ > 0
7       Normal        f(x) = (1/(σ√(2π))) e^{-(1/2)((x-μ)/σ)²}    e^{μt + σ²t²/2}              μ         σ²
16) Memoryless property of the exponential distribution:
P[X > S + t | X > S] = P[X > t]
17) Function of a random variable: f_Y(y) = f_X(x) |dx/dy|

UNIT-II (TWO DIMENSIONAL RANDOM VARIABLES)

1) Σ_i Σ_j p_ij = 1 (discrete random variables)
∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = 1 (continuous random variables)

2) Conditional probability function of X given Y: P(X = x_i / Y = y_i) = P(x, y) / P(y).
Conditional probability function of Y given X: P(Y = y_i / X = x_i) = P(x, y) / P(x).
P(X ≤ a / Y = b) = P(X ≤ a, Y = b) / P(Y = b)
3) Conditional density function of X given Y: f(x/y) = f(x, y) / f(y).
Conditional density function of Y given X: f(y/x) = f(x, y) / f(x).

4) If X and Y are independent random variables, then
f(x, y) = f(x) · f(y) (for continuous random variables)
P[X = x, Y = y] = P[X = x] · P[Y = y] (for discrete random variables)
5) Joint probability density function:
P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_c^d ∫_a^b f(x, y) dx dy
P(X < a, Y < b) = ∫_0^b ∫_0^a f(x, y) dx dy

6) Marginal density function of X: f(x) = f_X(x) = ∫_{-∞}^{∞} f(x, y) dy
Marginal density function of Y: f(y) = f_Y(y) = ∫_{-∞}^{∞} f(x, y) dx
7) P(X + Y ≥ 1) = 1 - P(X + Y < 1)
8) Correlation coefficient (discrete): ρ(x, y) = Cov(X, Y) / (σ_X σ_Y)
Cov(X, Y) = (1/n) Σ xy - x̄ ȳ,  σ_X = √((1/n) Σ x² - x̄²),  σ_Y = √((1/n) Σ y² - ȳ²)
9) Correlation coefficient (continuous): ρ(x, y) = Cov(X, Y) / (σ_X σ_Y)
Cov(X, Y) = E[XY] - E[X] E[Y],  σ_X = √Var(X),  σ_Y = √Var(Y)
10) If X and Y are uncorrelated random variables, then Cov(X, Y) = 0.
11) E[X] = ∫ x f(x) dx,  E[Y] = ∫ y f(y) dy,  E[XY] = ∫∫ xy f(x, y) dx dy

12) Regression for discrete random variables:
Regression line X on Y: x - x̄ = b_xy (y - ȳ),  b_xy = Σ(x - x̄)(y - ȳ) / Σ(y - ȳ)²
Regression line Y on X: y - ȳ = b_yx (x - x̄),  b_yx = Σ(x - x̄)(y - ȳ) / Σ(x - x̄)²
Correlation through the regression: r = ±√(b_xy · b_yx).  Note: ρ(x, y) = r(x, y)
13) Regression for continuous random variables:
Regression line X on Y: x - E(x) = b_xy (y - E(y)),  b_xy = r σ_x / σ_y
Regression line Y on X: y - E(y) = b_yx (x - E(x)),  b_yx = r σ_y / σ_x
Regression curve X on Y: x = E(x / y) = ∫ x f(x / y) dx
Regression curve Y on X: y = E(y / x) = ∫ y f(y / x) dy
14) Transformation of random variables:
(One dimensional random variable) f_Y(y) = f_X(x) |dx/dy|
(Two dimensional random variables) f_UV(u, v) = f_XY(x, y) |∂(x, y)/∂(u, v)|
15) Central limit theorem (Liapounoff's form):
If X_1, X_2, ..., X_n is a sequence of independent R.V.s with E[X_i] = μ_i and Var(X_i) = σ_i²,
i = 1, 2, ..., n, and if S_n = X_1 + X_2 + ... + X_n, then under certain general conditions S_n
follows a normal distribution with mean Σ_{i=1}^{n} μ_i and variance Σ_{i=1}^{n} σ_i² as n → ∞.
16) Central limit theorem (Lindeberg-Levy's form):
If X_1, X_2, ..., X_n is a sequence of independent, identically distributed R.V.s with E[X_i] = μ
and Var(X_i) = σ², i = 1, 2, ..., n, and if S_n = X_1 + X_2 + ... + X_n, then under certain general
conditions S_n follows a normal distribution with mean nμ and variance nσ² as n → ∞.
Note: z = (S_n - nμ) / (σ√n) (for n variables),  z = (X̄ - μ) / (σ/√n) (for a single variable X̄)
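A simulation sketch (added for illustration, using uniform summands; n and the trial count are arbitrary) shows the standardized sum settling onto a standard normal:

```python
import numpy as np

rng = np.random.default_rng(6)
n, trials = 50, 100_000
mu, sigma = 0.5, np.sqrt(1.0 / 12.0)        # mean and std of Uniform(0, 1)

s_n = rng.random((trials, n)).sum(axis=1)   # S_n = X_1 + ... + X_n
z = (s_n - n * mu) / (sigma * np.sqrt(n))   # standardized sums
print(z.mean(), z.std())                    # ≈ 0 and 1
print(np.mean(z < 1.96))                    # ≈ Φ(1.96) ≈ 0.975
```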

UNIT-III (MARKOV PROCESSES AND MARKOV CHAINS)


1) Random Process:
A random process is a collection of random variables {X(s,t)} that
are functions of a real variable, namely time t where s S and t T.

2) Classification of Random Processes:


We can classify the random process according to the characteristics of
time t and the random variable X. We shall consider only four cases based on
t and X having values in the ranges -∞ < t < ∞ and -∞ < x < ∞.
Continuous random process
Continuous random sequence
Discrete random process
Discrete random sequence
Continuous random process:
If X and t are continuous, then we call X(t) , a Continuous Random
Process. Example: If X(t) represents the maximum temperature at a
place in the interval (0,t), {X(t)} is a Continuous Random Process.
Continuous Random Sequence:
A random process for which X is continuous but time takes only discrete
values is called a Continuous Random Sequence.
Example: If X_n represents the temperature at the end of the nth hour of a day,
then {X_n, 1 ≤ n ≤ 24} is a continuous random sequence.
Discrete Random Process:
If X assumes only discrete values and t is continuous, then we call such
random process {X(t)} as Discrete Random Process.
Example: If X(t) represents the number of telephone calls received in the
interval (0,t) the {X(t)} is a discrete random process since S = {0,1,2,3, . . . }
Discrete Random Sequence:
A random process in which both the random variable and time are discrete is called

Discrete Random Sequence.


Example: If X_n represents the outcome of the nth toss of a fair die, then {X_n :
n ≥ 1} is a discrete random sequence, since T = {1, 2, 3, ...} and S = {1, 2, 3, 4, 5, 6}.

3) Condition for a stationary process: E[X(t)] = constant, Var[X(t)] = constant.
If the process is not stationary, then it is called evolutionary.


4) Wide Sense Stationary (or) Weak Sense Stationary (or) Covariance Stationary:
A random process is said to be WSS or Covariance Stationary if it satisfies the
following conditions.
i) The mean of the process is constant, i.e., E[X(t)] = constant.
ii) The auto correlation function depends only on τ, i.e.,
R_XX(τ) = E[X(t) · X(t + τ)]

5) Time average:
The time average of a random process X(t) is defined as X̄_T = (1/2T) ∫_{-T}^{T} X(t) dt.
If the interval is (0, T), then the time average is X̄_T = (1/T) ∫_{0}^{T} X(t) dt.


6) Ergodic process:
A random process X(t) is called ergodic if all its ensemble averages
are interchangeable with the corresponding time average X̄_T.
7) Mean ergodic:
Let X(t) be a random process with mean E[X(t)] = μ and time average X̄_T;
then X(t) is said to be mean ergodic if X̄_T → μ as T → ∞, i.e.,
E[X(t)] = lim_{T→∞} X̄_T.
Note: lim_{T→∞} Var(X̄_T) = 0 (by the mean ergodic theorem)

8) Correlation ergodic process:
The stationary process X(t) is said to be correlation ergodic if the process Y(t) is mean
ergodic, where Y(t) = X(t) X(t + τ), i.e., E[Y(t)] = lim_{T→∞} Ȳ_T,
where Ȳ_T is the time average of Y(t).
9) Auto covariance function:
C_XX(τ) = R_XX(τ) - E[X(t)] E[X(t + τ)]
10) Mean and variance of time average:
Mean: E[X̄_T] = (1/T) ∫_{0}^{T} E[X(t)] dt
Variance: Var(X̄_T) = (1/2T) ∫_{-2T}^{2T} (1 - |τ|/2T) C_XX(τ) dτ

11) Markov process:


A random process in which the future value depends only on the present value
and not on the past values is called a Markov process. It is symbolically
represented by
P[X(t_{n+1}) ≤ x_{n+1} | X(t_n) = x_n, X(t_{n-1}) = x_{n-1}, ..., X(t_0) = x_0]
   = P[X(t_{n+1}) ≤ x_{n+1} | X(t_n) = x_n]
where t_0 ≤ t_1 ≤ t_2 ≤ ... ≤ t_n ≤ t_{n+1}.


12) Markov chain:
If, for all n,
P[X_n = a_n | X_{n-1} = a_{n-1}, X_{n-2} = a_{n-2}, ..., X_0 = a_0]
   = P[X_n = a_n | X_{n-1} = a_{n-1}],
then the process {X_n}, n = 0, 1, 2, ..., is called a Markov chain,
where a_0, a_1, a_2, ..., a_n, ... are called the states of the Markov chain.
13) Transition probability matrix (tpm):
When the Markov chain is homogeneous, the one-step transition probability
is denoted by P_ij. The matrix P = {P_ij} is called the transition probability matrix.
14) Chapman-Kolmogorov theorem:
If P is the tpm of a homogeneous Markov chain, then the n-step tpm P^(n) is
equal to P^n, i.e., P_ij^(n) = [P^n]_ij.
15) Markov chain property: If Π = (Π_1, Π_2, Π_3), then ΠP = Π and
Π_1 + Π_2 + Π_3 = 1.
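Both facts are easy to exercise numerically (an added sketch with a made-up 3-state tpm): P^(n) is just a matrix power, and the stationary Π can be found by solving ΠP = Π together with ΣΠ_i = 1:

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],     # hypothetical homogeneous 3-state tpm
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

P3 = np.linalg.matrix_power(P, 3)  # Chapman-Kolmogorov: 3-step tpm P^(3) = P³

# Solve Π P = Π with Σ Π_i = 1, written as (Pᵀ - I) Πᵀ = 0 plus the sum constraint.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(P3)
print(pi, pi @ P)                  # Π and Π P agree at stationarity
```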
16) Poisson process:
If X(t) represents the number of occurrences of a certain event in (0, t),
then the discrete random process {X(t)} is called a Poisson process,
provided the following postulates are satisfied:
(i) P[1 occurrence in (t, t + Δt)] = λΔt + O(Δt)
(ii) P[0 occurrences in (t, t + Δt)] = 1 - λΔt + O(Δt)
(iii) P[2 or more occurrences in (t, t + Δt)] = O(Δt)
(iv) X(t) is independent of the number of occurrences of the event in any other interval.

17) Probability law of the Poisson process:
P[X(t) = x] = e^{-λt} (λt)^x / x!,  x = 0, 1, 2, ...
Mean: E[X(t)] = λt,  E[X²(t)] = λ²t² + λt,  Var[X(t)] = λt.

UNIT-IV (CORRELATION AND SPECTRAL DENSITY)


RXX - Auto correlation function

SXX - Power spectral density (or) Spectral density


RXY - Cross correlation function
SXY - Cross power spectral density
1) Auto correlation to power spectral density (spectral density):
S_XX(ω) = ∫_{-∞}^{∞} R_XX(τ) e^{-iωτ} dτ
2) Power spectral density to auto correlation:
R_XX(τ) = (1/2π) ∫_{-∞}^{∞} S_XX(ω) e^{iωτ} dω
3) Condition for X(t) and X(t + τ) to be uncorrelated random processes:
C_XX(τ) = R_XX(τ) - E[X(t)] E[X(t + τ)] = 0
4) Cross power spectrum to cross correlation:
R_XY(τ) = (1/2π) ∫_{-∞}^{∞} S_XY(ω) e^{iωτ} dω

5) General formulas:
i) ∫ e^{ax} cos bx dx = e^{ax} (a cos bx + b sin bx) / (a² + b²)
ii) ∫ e^{ax} sin bx dx = e^{ax} (a sin bx - b cos bx) / (a² + b²)
iii) ∫ x e^{-ax} dx = -e^{-ax} (x/a + 1/a²)
iv) sin θ = (e^{iθ} - e^{-iθ}) / (2i)
v) cos θ = (e^{iθ} + e^{-iθ}) / 2

UNIT-V (LINEAR SYSTEMS WITH RANDOM INPUTS)


1) Linear system:
f is called a linear system if it satisfies
f[a_1 X_1(t) + a_2 X_2(t)] = a_1 f[X_1(t)] + a_2 f[X_2(t)]
2) Time invariant system:
Let Y(t) = f[X(t)]. If Y(t + h) = f[X(t + h)], then f is called a
time invariant system.


3) Relation between input X(t) and output Y(t):
Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du
where h(u) is the system weighting function.
4) Relation between the power spectra of X(t) and Y(t):
S_YY(ω) = S_XX(ω) |H(ω)|²
If H(ω) is not given, use the formula H(ω) = ∫_{-∞}^{∞} h(t) e^{-jωt} dt.

5) Contour integral:
∫_{-∞}^{∞} e^{imx} / (x² + a²) dx = (π/a) e^{-ma}   (one of the results)
6) F^{-1}[ 1/(a² + ω²) ] = (1/(2a)) e^{-a|τ|}   (from the Fourier transform)
7) A linear time invariant system is stable if ∫_{-∞}^{∞} |h(t)| dt < ∞.

UNIT I: PROBABILITY AND RANDOM VARIABLES


PART B QUESTIONS
1. A random variable X has the following probability distribution:
X    :  0   1   2    3    4    5    6     7
P(X) :  0   k   2k   2k   3k   k²   2k²   7k² + k
Find (i) the value of k, (ii) P[1.5 < X < 4.5 | X > 2] and (iii) the smallest value of λ
for which P(X ≤ λ) > 1/2.
2. A bag contains 5 balls and its not known how many of them are white. Two balls
are drawn at random from the bag and they are noted to be white. What is the
chance that the balls in the bag all are white.
3. Let the random variable X have the PDF f(x) = (1/2) e^{-x/2}, x > 0. Find the moment
generating function, mean and variance.
4. A die is tossed until a 6 appears. What is the probability that it must be
tossed more than 4 times?
5. A man draws 3 balls from an urn containing 5 white and 7 black
balls. He gets Rs. 10 for each white ball and Rs 5 for each black
ball. Find his expectation.
6. In a certain binary communication channel, owing to noise, the
probability that a transmitted zero is received as zero is 0.95 and the
probability that a transmitted one is received as one is 0.9. If the
probability that a zero is transmitted is 0.4, find the probability that (i) a
one is received (ii) a one was transmitted given that one was received
7. Find the MGF and the rth moment for the distribution whose PDF is f(x) = k e^{-x},
x > 0. Find also the standard deviation.
8. The first bag contains 3 white balls, 2 red balls and 4 black balls. Second
bag contains 2 white, 3 red and 5 black balls and third bag contains 3
white, 4 red and 2 black balls. One bag is chosen at random and from it 3
balls are drawn. Out of 3 balls, 2 balls are white and 1 is red. What are
the probabilities that they were taken from first bag, second bag and third
bag.
9. A random variable X has the PDF f(x) = 2x, 0 < x < 1. Find (i) P(X < 1/2)
(ii) P(1/4 < X < 1/2) (iii) P(X > 3/4 | X > 1/2)
10. If the density function of a continuous random variable X is given by
f(x) = ax,       0 ≤ x ≤ 1
     = a,        1 ≤ x ≤ 2
     = 3a - ax,  2 ≤ x ≤ 3
     = 0,        otherwise
(1) Find a. (2) Find the cdf of X.
11. If the moments of a random variable X are defined by E(X^r) = 0.6, r = 1, 2, ...,
show that P(X = 0) = 0.4, P(X = 1) = 0.6, P(X ≥ 2) = 0.

12. In a continuous distribution, the probability density is given by f(x) = kx(2 - x),
0 < x < 2. Find k, the mean, the variance and the distribution function.
13. The cumulative distribution function of a random variable X is given by
F(x) = 0,                    x < 0
     = x²,                   0 ≤ x ≤ 1/2
     = 1 - (3/25)(3 - x)²,   1/2 ≤ x < 3
     = 1,                    x ≥ 3
Find the pdf of X and evaluate P(|X| ≤ 1) using both the pdf and the cdf.
14. Find the moment generating function of the geometric random variable
with the pdf f(x) = p q^{x-1}, x = 1, 2, 3, ..., and hence find its mean and variance.

15. A box contains 5 red and 4 white balls. A ball from the box is taken out
at random and kept outside. If once again a ball is drawn from the box,
what is the probability that the drawn ball is red?
16. A discrete random variable X has moment generating function
M_X(t) = (1/4 + (3/4) e^t)^5. Find E(X), Var(X) and P(X = 2).
17. The pdf of the samples of the amplitude of a speech waveform is found to
decay exponentially at rate α, so the following pdf is proposed:
f(x) = C e^{-α|x|}, -∞ < x < ∞. Find C and E(X).

18. Find the MGF of a binomial distribution and hence find the mean and variance.
Also find the recurrence relation of central moments for a binomial distribution.
19. The number of monthly breakdowns of a computer is a RV having a Poisson
distribution with mean equal to 1.8. Find the probability that this computer will
function for a month (a) without a breakdown, (b) with only one breakdown, (c)
with at least one breakdown.
20. Find the MGF and hence find the mean and variance of a binomial distribution.
21. State and prove additive property of poisson random variable.
22. If X and Y are two independent poisson random variable, then show that
probability distribution of X given X+Y follows binomial distribution.
23. Find MGF and hence find mean and variance of a geometric distribution.
24. State and prove memory less property of a Geometric Distribution.

25. Find the mean and variance of a uniform distribution.

UNIT- II TWO DIMENSIONAL RANDOM VARIABLES


Part- B
1. If f(x, y) = x + y, 0 < x < 1, 0 < y < 1
            = 0, otherwise
compute the correlation co-efficient between X and Y.
2. The joint p.d.f of a two dimensional random variable (X, Y) is given by f(x, y) =
(8/9) xy, 1 ≤ x ≤ y ≤ 2. Find the marginal density functions of X and Y. Find
also the conditional density functions of Y / X = x and X / Y = y.
3. The joint probability density function of X and Y is given by f(x, y) = (x + y)/3,
0 ≤ x ≤ 1, 0 < y < 2. Obtain the regression of Y on X and of X on Y.

4. If the joint p.d.f. of two random variables is given by f(x1, x2) = 6 e^{-2x1 - 3x2},
x1 > 0, x2 > 0, find the probability that the first random variable will take on a value
between 1 and 2 and the second random variable will take on a value between 2
and 3. Also find the probability that the first random variable will take on a value
less than 2 and the second random variable will take on a value greater than 2.

5. Two random variables have the joint p.d.f. f(x1, x2) = (2/3)(x1 + 2x2),
0 < x1 < 1, 0 < x2 < 1.
6. Find the value of k, if f(x,y) = k xy for 0 < x,y < 1 is to be a joint density
function. Find P(X + Y < 1 ) . Are X and Y independent.

7. If two random variables have joint p.d.f. f(x, y) = (6/5)(x + y²), 0 < x < 1,
0 < y < 1, find P(0.2 < X < 0.5) and P(0.4 < Y < 0.6).

8. Two random variables X and Y have p.d.f f(x, y) = x² + (xy/3), 0 ≤ x ≤ 1,
0 ≤ y ≤ 2. Prove that X and Y are not independent. Find the conditional
density function.

9. X and Y are two random variables with joint p.d.f. f(x, y) = 4xy e^{-(x² + y²)},
x, y ≥ 0. Find the p.d.f. of X² + Y².
10. Two random variables X and Y have joint p.d.f. f(x, y) = 2 - x - y, 0 < x < 1,
0 < y < 1. Find the marginal probability density functions of X and Y. Also find the
conditional density function and the covariance between X and Y.
11. Let X and Y be two random variables each taking three values -1, 0 and 1 and
having the joint p.d.f.
          X
Y       -1     0      1
-1       0     0.1    0.1
0        0.2   0.2    0.2
1        0     0.1    0.1
Prove that X and Y have different expectations. Also prove that X and Y are
uncorrelated, and find Var X and Var Y.

12. 20 dice are thrown. Find the approximate probability that the sum obtained
is between 65 and 75 using the central limit theorem.
13. Examine whether the variables X and Y are independent, whose joint density
is f(x, y) = x e^{-x(y+1)}, 0 < x, y < ∞.

14. Let X and Y be independent standard normal random variables. Find the pdf
of z =X / Y.
15. Let X and Y be independent uniform random variables over (0,1) . Find
the PDF of Z = X + Y
UNIT-III CLASSIFICATION OF RANDOM PROCESS
PART B
1. The process {X(t)} whose probability distribution is given by
P[X(t) = n] = (at)^{n-1} / (1 + at)^{n+1},  n = 1, 2, ...
            = at / (1 + at),                n = 0
Show that it is not stationary.

2. A raining process is considered as a two state Markov chain. If it rains, it is
considered to be in state 0 and if it does not rain, the chain is in state 1. The
transition probability matrix of the Markov chain is defined as
P = [0.6 0.4; 0.2 0.8]
Find the probability that it will rain for three days from today, assuming that it is
raining today. Find also the unconditional probability that it will rain after three days,
with the initial probabilities of state 0 and state 1 as 0.4 and 0.6 respectively.
3. Let X(t) be a Poisson process with arrival rate λ. Find E{[X(t) - X(s)]²} for t > s.
4. Let {Xn ; n = 1, 2, ...} be a Markov chain on the space S = {1, 2, 3} with one-step
transition probability matrix
P = [0 1 0; 1/2 0 1/2; 1 0 0]
(1) Sketch the transition diagram. (2) Is the chain irreducible? Explain. (3) Is the
chain ergodic? Explain.

5. Consider a random process X(t) defined by X(t) = U cos t + (V + 1) sin t, where U
and V are independent random variables for which E(U) = E(V) = 0; E(U²) = E(V²) = 1.
(1) Find the auto covariance function of X(t). (2) Is X(t) wide sense
stationary? Explain your answer.
6. Discuss the pure birth process and hence obtain its probabilities, mean and
variance.
7. At the receiver of an AM radio, the received signal contains a cosine carrier
signal at the carrier frequency ω0 with a random phase θ that is uniformly
distributed over (0, 2π). The received carrier signal is X(t) = A cos(ω0 t + θ). Show
that the process is second order stationary.
8. Assume that a computer system is in any one of the three states: busy, idle and
under repair, respectively denoted by 0, 1, 2. Observing its state at 2 pm each day,
we get the transition probability matrix
P = [0.6 0.2 0.2; 0.1 0.8 0.1; 0.6 0 0.4]
Find out the 3rd step transition probability matrix. Determine the limiting probabilities.

9. Given a random variable ω with density f(ω) and another random variable θ
uniformly distributed in (-π, π) and independent of ω, and X(t) = a cos(ωt + θ),
prove that {X(t)} is a WSS process.
10. A man either drives a car or catches a train to go to office each day. He never
goes 2 days in a row by train but if he drives one day, then the next day he is just
as likely to drive again as he is to travel by train. Now suppose that on the first
day of the week, the man tossed a fair die and drove to work iff a 6 appeared.
Find (1) the probability that he takes a train on the 3rd day, (2) the probability
that he drives to work in the long run.

11. Show that the process X(t) = A cos t + B sin t (where A and B are random
variables) is wide sense stationary, if (1) E(A) = E(B) = 0, (2) E(A²) = E(B²) and
(3) E(AB) = 0.
12. Find probability distribution of Poisson process.
13. Prove that sum of two Poisson processes is again a Poisson process.
14. Write classifications of random processes with example

Unit 4
Correlation and Spectrum Densities
