Chebyshev Inequality

$$P\{|X - \mu_X| \ge \varepsilon\} \le \frac{\sigma_X^2}{\varepsilon^2}$$

Proof:

$$\sigma_X^2 = \int_{-\infty}^{\infty} (x - \mu_X)^2 f_X(x)\,dx \ge \int_{|x - \mu_X| \ge \varepsilon} (x - \mu_X)^2 f_X(x)\,dx \ge \int_{|x - \mu_X| \ge \varepsilon} \varepsilon^2 f_X(x)\,dx = \varepsilon^2 P\{|X - \mu_X| \ge \varepsilon\}$$

$$\therefore\ P\{|X - \mu_X| \ge \varepsilon\} \le \frac{\sigma_X^2}{\varepsilon^2}$$
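The bound can be checked numerically. The sketch below, a purely illustrative setup, draws from an Exp(1) distribution (mean 1, variance 1) and compares the empirical tail probability against the Chebyshev bound; all variable names are hypothetical.

```python
import random

# Empirical check of Chebyshev: P{|X - mu_X| >= eps} <= sigma_X^2 / eps^2
# Illustrative choice: X ~ Exp(1), so mu_X = 1 and sigma_X^2 = 1.
random.seed(0)
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]
mu, var = 1.0, 1.0
eps = 2.0
# Fraction of samples at least eps away from the mean
empirical = sum(abs(x - mu) >= eps for x in samples) / n
bound = var / eps**2  # Chebyshev bound = 0.25
print(empirical, bound)
```

The true tail probability here is $e^{-3} \approx 0.05$, well under the bound of 0.25 — Chebyshev is valid but often loose.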
Markov Inequality

For a random variable X which takes only non-negative values,

$$P\{X \ge a\} \le \frac{E(X)}{a}, \quad \text{where } a > 0.$$

Proof:

$$E(X) = \int_{0}^{\infty} x f_X(x)\,dx \ge \int_{a}^{\infty} x f_X(x)\,dx \ge \int_{a}^{\infty} a f_X(x)\,dx = a\,P\{X \ge a\}$$

$$\therefore\ P\{X \ge a\} \le \frac{E(X)}{a}$$

Result: applying this to the non-negative random variable $(X - k)^2$,

$$P\{(X - k)^2 \ge a\} \le \frac{E(X - k)^2}{a}$$
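The same kind of numerical check works for Markov's inequality. This sketch again uses an Exp(1) variable (non-negative, with E(X) = 1) as an illustrative assumption:

```python
import random

# Empirical check of Markov: P{X >= a} <= E(X)/a for non-negative X.
# Illustrative choice: X ~ Exp(1), so E(X) = 1.
random.seed(1)
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]
a = 4.0
empirical = sum(x >= a for x in samples) / n  # true value is e^-4
bound = 1.0 / a                               # E(X)/a = 0.25
print(empirical, bound)
```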
Convergence of a sequence of random variables

Let $X_1, X_2, \ldots, X_n$ be a sequence of n independent and identically distributed random variables. Suppose we want to estimate the mean of the random variable on the basis of the observed data by means of the relation

$$\hat{\mu}_X = \frac{1}{n} \sum_{i=1}^{n} X_i$$

How closely does $\hat{\mu}_X$ represent $\mu_X$ as n is increased? How do we measure the closeness between $\hat{\mu}_X$ and $\mu_X$? Notice that $\hat{\mu}_X$ is a random variable. What do we mean by the statement "$\hat{\mu}_X$ converges to $\mu_X$"?
Consider a deterministic sequence $x_1, x_2, \ldots, x_n, \ldots$ The sequence converges to a limit x if, corresponding to any $\varepsilon > 0$, we can find a positive integer m such that $|x - x_n| < \varepsilon$ for $n > m$. Convergence of a random sequence $X_1, X_2, \ldots, X_n$ cannot be defined as above.
$$E\left(\frac{1}{n}\sum_{i=1}^{n} X_i - \mu_X\right)^2 = \frac{1}{n^2}\,E\left(\sum_{i=1}^{n}(X_i - \mu_X)\right)^2$$

$$= \frac{1}{n^2}\sum_{i=1}^{n} E(X_i - \mu_X)^2 + \frac{1}{n^2}\sum_{i=1}^{n}\sum_{j=1,\,j\ne i}^{n} E(X_i - \mu_X)(X_j - \mu_X)$$

$$= \frac{n\sigma_X^2}{n^2} + 0 \quad \text{(because of independence)}$$

$$= \frac{\sigma_X^2}{n}$$

$$\therefore\ \lim_{n\to\infty} E\left(\frac{1}{n}\sum_{i=1}^{n} X_i - \mu_X\right)^2 = 0$$
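The $\sigma_X^2/n$ decay of the mean-square error can be seen in simulation. This is a sketch under an illustrative assumption (samples from U(0,1), for which $\mu_X = 0.5$ and $\sigma_X^2 = 1/12$); the function name is hypothetical.

```python
import random
from statistics import fmean

random.seed(2)

def mse_of_mean(n, trials=2000):
    """Monte Carlo estimate of E[(sample mean of n U(0,1) draws - 0.5)^2]."""
    mu = 0.5
    errs = []
    for _ in range(trials):
        m = fmean(random.random() for _ in range(n))
        errs.append((m - mu) ** 2)
    return fmean(errs)

var_x = 1 / 12  # variance of U(0,1)
for n in (10, 100, 1000):
    # Empirical mean-square error vs. the theoretical value sigma_X^2 / n
    print(n, mse_of_mean(n), var_x / n)
```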
Convergence in probability

$P\{|X_n - X| > \varepsilon\}$ is a sequence of probabilities. $X_n$ is said to converge to X in probability if this sequence of probabilities converges to zero, that is, $P\{|X_n - X| > \varepsilon\} \to 0$ as $n \to \infty$.
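This definition can also be illustrated empirically: for the sample mean of U(0,1) draws, the fraction of trials where the mean lands more than $\varepsilon$ from 0.5 shrinks as n grows. The setup and function name below are illustrative assumptions.

```python
import random
from statistics import fmean

random.seed(3)
eps = 0.05

def tail_prob(n, trials=1000):
    """Empirical P{|mean of n U(0,1) draws - 0.5| > eps}."""
    return fmean(
        abs(fmean(random.random() for _ in range(n)) - 0.5) > eps
        for _ in range(trials)
    )

for n in (10, 100, 1000):
    # The tail probability should fall toward 0 as n increases
    print(n, tail_prob(n))
```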
where

$$A = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1 - \rho_{X,Y}^2}}$$
Properties:

(1) If X and Y are jointly Gaussian, then for any constants a and b, the random variable Z given by $Z = aX + bY$ is Gaussian with mean

$$\mu_Z = a\mu_X + b\mu_Y$$

and variance

$$\sigma_Z^2 = a^2\sigma_X^2 + b^2\sigma_Y^2 + 2ab\,\sigma_X\sigma_Y\rho_{X,Y}$$
(2) If two jointly Gaussian r.v.s are uncorrelated ($\rho_{X,Y} = 0$), then they are statistically independent:

$$f_{X,Y}(x, y) = f_X(x)\,f_Y(y).$$
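Property (1) can be verified by simulation. The sketch below, under illustrative parameter choices, samples Z = aX + bY from two independent Gaussians (so $\rho_{X,Y} = 0$ and the cross term vanishes) and compares the empirical mean and variance with the formulas above.

```python
import random
from statistics import fmean, pvariance

# Illustrative parameters (assumptions, not from the text):
# X ~ N(1, 1), Y ~ N(3, 4), independent, and Z = 2X - Y.
random.seed(4)
a, b = 2.0, -1.0
mu_x, sig_x = 1.0, 1.0
mu_y, sig_y = 3.0, 2.0
z = [a * random.gauss(mu_x, sig_x) + b * random.gauss(mu_y, sig_y)
     for _ in range(200_000)]

# Theory: mu_Z = a*mu_X + b*mu_Y = -1.0
#         sigma_Z^2 = a^2*sig_X^2 + b^2*sig_Y^2 = 8.0 (rho = 0)
print(fmean(z), pvariance(z))
```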