9 CONVERGENCE IN PROBABILITY
Convergence in probability
The idea is to extricate a simple deterministic component out of a random situation. This is
typically possible when a large number of random effects cancel each other out, so some limit
is involved. The general situation, then, is the following: given a sequence of random variables,
Y1, Y2, . . ., we want to show that, when n is large, Yn is approximately f(n) for some simple
deterministic function f(n). What "approximately" means is what we now make precise.
= P(⋃_{i=1}^{n−k+1} {tosses i, i + 1, . . . , i + k − 1 are all Heads}) ≤ Σ_{i=1}^{n−k+1} (1/2)^k ≤ (n − k + 1) (1/2)^k ≤ n (1/2)^{k−1}.
For the lower bound, divide the string of size n into disjoint blocks of size k. There are ⌊n/k⌋
such blocks (if n is not divisible by k, simply throw away the leftover smaller block at the end).
Then Rn ≥ k as soon as one of the blocks consists of Heads only, and different blocks are
independent. Therefore,
P(Rn < k) ≤ (1 − (1/2)^k)^{⌊n/k⌋} ≤ exp(−(1/2)^k ⌊n/k⌋),
using the famous inequality 1 − x ≤ e^{−x}, valid for all x.
Below, we will use these trivial inequalities, valid for any real number x ≥ 2: ⌊x⌋ ≥ x − 1,
⌈x⌉ ≤ x + 1, x − 1 ≥ x/2, and x + 1 ≤ 2x.
To demonstrate that

Rn / log2 n → 1, as n → ∞, in probability,

we need to show that, for any ε > 0,

(1)  P(Rn ≥ (1 + ε) log2 n) → 0,

(2)  P(Rn ≤ (1 − ε) log2 n) → 0,

as n → ∞. This is enough because

P(|Rn/log2 n − 1| ≥ ε) = P(Rn/log2 n ≥ 1 + ε or Rn/log2 n ≤ 1 − ε)
  = P(Rn/log2 n ≥ 1 + ε) + P(Rn/log2 n ≤ 1 − ε)
  = P(Rn ≥ (1 + ε) log2 n) + P(Rn ≤ (1 − ε) log2 n).
A little bit of fussing in the proof comes from the fact that (1 ± ε) log2 n are not integers. This
is common in problems such as this. To prove (1), we plug k = ⌈(1 + ε) log2 n⌉ into the upper
bound to get
P(Rn ≥ (1 + ε) log2 n) ≤ n (1/2)^{⌈(1+ε) log2 n⌉ − 1} ≤ n · 2/n^{1+ε} = 2/n^ε → 0
as n → ∞. On the other hand, to prove (2) we need to plug k = ⌊(1 − ε) log2 n⌋ + 1 into the
lower bound,
P(Rn ≤ (1 − ε) log2 n) ≤ P(Rn < k)
  ≤ exp(−(1/2)^k ⌊n/k⌋)
  ≤ exp(−(1/2)^k (n/k − 1))
  ≤ exp(−(1/(2 n^{1−ε})) · (n/(2k)))
  ≤ exp(−n^ε/(8 (1 − ε) log2 n))
  → 0,

as n → ∞. In the fourth line we used (1/2)^k ≥ 1/(2 n^{1−ε}) and n/k − 1 ≥ n/(2k), and in the
fifth line k ≤ 2 (1 − ε) log2 n.
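As a quick empirical check (a sketch of our own, not part of the proof; the helper name longest_run is ours), one can simulate a long string of fair-coin tosses and compare Rn with log2 n:

```python
import math
import random

def longest_run(n, rng):
    """Length of the longest run of Heads among n fair-coin tosses."""
    best = cur = 0
    for _ in range(n):
        if rng.random() < 0.5:   # Heads
            cur += 1
            best = max(best, cur)
        else:                    # Tails
            cur = 0
    return best

rng = random.Random(0)
n = 10**6
r = longest_run(n, rng)
print(n, r, r / math.log2(n))
```

Since the fluctuations of Rn around log2 n are of constant order while log2 n grows slowly, the ratio approaches 1 quite slowly.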
P(|X − EX| ≥ k) ≤ Var(X)/k²,

for any k > 0. We proved this inequality (Chebyshev's inequality) in the previous chapter, and
we will use it to prove the next theorem.
Theorem 9.1. Connection between variance and convergence in probability.
Assume that Yn are random variables and a is a constant such that

EYn → a,
Var(Yn) → 0,

as n → ∞. Then

Yn → a,

as n → ∞, in probability.
Proof. Fix an ε > 0. If n is so large that

|EYn − a| < ε/2,

then |Yn − a| > ε implies |Yn − EYn| > ε/2, so by Chebyshev's inequality

P(|Yn − a| > ε) ≤ P(|Yn − EYn| > ε/2) ≤ 4 Var(Yn)/ε² → 0,

as n → ∞.
For a sum Sn = X1 + · · · + Xn, recall that

E(Sn²) = Σ_{i=1}^n E(Xi²) + Σ_{i≠j} E(Xi Xj),

Var(Sn) = Σ_{i=1}^n Var(Xi) + Σ_{i≠j} Cov(Xi, Xj).
Sn/n → μ,

as n → ∞, in probability.

Proof. Let Yn = Sn/n. Then EYn = μ and

Var(Yn) = Var(Sn)/n² = n σ²/n² = σ²/n → 0,

as n → ∞, so the claim follows from Theorem 9.1.
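As a numerical illustration (a sketch of our own, not part of the text), averages of independent rolls of a fair die, with μ = 3.5, settle near the mean:

```python
import random

rng = random.Random(1)
n = 200_000
sn = sum(rng.randint(1, 6) for _ in range(n))   # S_n = X_1 + ... + X_n
print(sn / n)                                   # close to 3.5
```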
It is important to note that the expected value of the capital at the end of the year is
maximized when x = 1, but using this strategy you will eventually lose everything. Let Xn be
your capital at the end of year n. Define the average growth rate of your investment as
λ = lim_{n→∞} (1/n) log(Xn/x0),

so that

Xn ≈ x0 e^{λn}.

We will express λ in terms of x; in particular, we will show it is a nonrandom quantity.
Let Ii be the indicator of the event that the stock goes up in year i. These are independent
indicators with EIi = 0.8. Then

Xn = Xn−1 (1 − x) · 1.06 + Xn−1 · x · 1.5 · In
   = Xn−1 (1.06 (1 − x) + 1.5 x In),

and so we can unroll the recurrence to get

Xn = x0 (1.06 (1 − x) + 1.5 x)^{Sn} (1.06 (1 − x))^{n−Sn},
where Sn = I1 + . . . + In . Therefore,
(1/n) log(Xn/x0) = (Sn/n) log(1.06 + 0.44 x) + (1 − Sn/n) log(1.06 (1 − x))
  → 0.8 log(1.06 + 0.44 x) + 0.2 log(1.06 (1 − x)),

as n → ∞, in probability, by the weak law of large numbers (note that 1.06 (1 − x) + 1.5 x =
1.06 + 0.44 x). Therefore,

λ = 0.8 log(1.06 + 0.44 x) + 0.2 log(1.06 (1 − x)),

and setting dλ/dx = 0.8 · 0.44/(1.06 + 0.44 x) − 0.2/(1 − x) = 0 shows that λ is maximized at
x = 7/22 ≈ 0.318.
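The maximizer x = 7/22 can be confirmed numerically. The sketch below (our own; the name growth_rate is ours) ternary-searches λ(x), which is concave on [0, 1), for its maximum:

```python
import math

def growth_rate(x):
    """lambda(x) = 0.8 log(1.06 + 0.44 x) + 0.2 log(1.06 (1 - x))."""
    return 0.8 * math.log(1.06 + 0.44 * x) + 0.2 * math.log(1.06 * (1 - x))

# Ternary search is valid here because growth_rate is concave.
lo, hi = 0.0, 0.99
for _ in range(200):
    m1 = lo + (hi - lo) / 3
    m2 = hi - (hi - lo) / 3
    if growth_rate(m1) < growth_rate(m2):
        lo = m1
    else:
        hi = m2
x_opt = (lo + hi) / 2
print(x_opt, growth_rate(x_opt))
```

Since the bracketing interval shrinks by a factor 2/3 per step, 200 steps locate the maximum far beyond the precision we check.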
Example 9.3. Distribute n balls independently at random into n boxes. Let Nn be the number
of empty boxes. Show that (1/n) Nn converges in probability and identify the limit.
Note that
Nn = I1 + . . . + In ,
where Ii is the indicator of the event that the ith box is empty, but you cannot use the weak
law of large numbers, as the Ii are not independent. Nevertheless,
EIi = ((n − 1)/n)^n = (1 − 1/n)^n,

and so

ENn = n (1 − 1/n)^n.
Moreover,

E(Nn²) = ENn + Σ_{i≠j} E(Ii Ij),

with

E(Ii Ij) = P(boxes i and j are both empty) = ((n − 2)/n)^n = (1 − 2/n)^n,

so that

Var(Nn) = E(Nn²) − (ENn)² = n (1 − 1/n)^n + n(n − 1) (1 − 2/n)^n − n² (1 − 1/n)^{2n}.
Let Yn = Nn/n. Then

EYn → e^{−1},

as n → ∞, and

Var(Yn) = (1/n)(1 − 1/n)^n + ((n − 1)/n)(1 − 2/n)^n − (1 − 1/n)^{2n} → 0 + e^{−2} − e^{−2} = 0,

as n → ∞. Therefore,

Yn = Nn/n → e^{−1},

as n → ∞, in probability.
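A simulation (a sketch of our own, not part of the text) agrees: with n = 10^5 boxes, the observed fraction of empty boxes lands close to e^{−1} ≈ 0.3679.

```python
import math
import random

rng = random.Random(2)
n = 100_000
counts = [0] * n
for _ in range(n):                    # drop n balls into n boxes at random
    counts[rng.randrange(n)] += 1
frac_empty = counts.count(0) / n      # N_n / n
print(frac_empty, math.exp(-1))
```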
Problems
1. Assume that n married couples (amounting to 2n people) are seated at random on 2n seats
around a table. Let T be the number of couples that sit together. Determine ET and Var(T ).
2. There are n birds that sit in a row on a wire. Each bird looks left or right with equal
probability. Let N be the number of birds not seen by any neighboring bird. Determine, with
proof, the constant c so that, as n → ∞, (1/n) N → c in probability.
3. Recall the coupon collector problem: sample from n cards, with replacement, indefinitely,
and let N be the number of cards you need until each of the n different cards is represented.
Find a sequence an so that, as n → ∞, N/an converges to 1 in probability.
4. Kings and Lakers are playing a best of seven playoff series, which means they play until
one team wins four games. Assume Kings win every game independently with probability p.
(There is no difference between home and away games.) Let N be the number of games played.
Compute EN and Var(N ).
5. An urn contains n red and m black balls. Pull balls from the urn one by one without
replacement. Let X be the number of red balls you pull before any black one, and Y the
number of red balls between the first and the second black one. Compute EX and EY .
Solutions to problems
1. Let Ii be the indicator of the event that the ith couple sits together. Then T = I1 + · · · + In.
Moreover,

EIi = 2/(2n − 1),   E(Ii Ij) = 2² (2n − 3)!/(2n − 1)! = 4/((2n − 1)(2n − 2)),

for any i and j ≠ i. Thus

ET = 2n/(2n − 1)

and

E(T²) = ET + n(n − 1) · 4/((2n − 1)(2n − 2)) = 4n/(2n − 1),

so

Var(T) = 4n/(2n − 1) − 4n²/(2n − 1)² = 4n(n − 1)/(2n − 1)².
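These formulas can be verified exactly by brute force for small n (a sketch of our own): for n = 3 couples on 6 seats, they give ET = 6/5 and Var(T) = 24/25.

```python
from fractions import Fraction
from itertools import permutations

# 3 couples on 6 seats around a table; couple c consists of persons 2c and 2c+1.
n, seats = 3, 6
total = t_sum = t_sq_sum = 0
for perm in permutations(range(seats)):         # perm[seat] = person
    t = 0
    for c in range(n):
        a, b = perm.index(2 * c), perm.index(2 * c + 1)
        if (a - b) % seats in (1, seats - 1):   # seats adjacent on the circle
            t += 1
    total += 1
    t_sum += t
    t_sq_sum += t * t
ET = Fraction(t_sum, total)
VarT = Fraction(t_sq_sum, total) - ET * ET
print(ET, VarT)
```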
2. Let Ii indicate the event that bird i is not seen by any other bird. Then EIi is 1/2 if i = 1 or
i = n, and 1/4 otherwise. It follows that

EN = 1 + (n − 2)/4 = (n + 2)/4.

Furthermore, Ii and Ij are independent if |i − j| ≥ 3 (two birds that have two or more birds
between them are observed independently). Thus Cov(Ii, Ij) = 0 if |i − j| ≥ 3. As Ii and Ij are
indicators, Cov(Ii, Ij) ≤ 1 for any i and j. For the same reason, Var(Ii) ≤ 1. Therefore,

Var(N) = Σ_i Var(Ii) + Σ_{i≠j} Cov(Ii, Ij) ≤ n + 4n = 5n.
Clearly, if M = (1/n) N, then EM = (1/n) EN → 1/4 and Var(M) = (1/n²) Var(N) → 0. It
follows that c = 1/4.
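The formula EN = (n + 2)/4 can be checked exactly for small n by enumerating all 2^n ways the birds can look (a sketch of our own; the name exact_EN is ours):

```python
from fractions import Fraction
from itertools import product

def exact_EN(n):
    """Exact E[N] over all 2^n equally likely left/right choices."""
    total = 0
    for looks in product('LR', repeat=n):       # looks[i] = direction bird i looks
        for i in range(n):
            seen = (i > 0 and looks[i - 1] == 'R') or \
                   (i < n - 1 and looks[i + 1] == 'L')
            if not seen:
                total += 1
    return Fraction(total, 2**n)

print([exact_EN(n) for n in (2, 4, 7)])   # compare with (n + 2)/4
```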
3. Let Ni be the number of coupons needed to get i different coupons after having i − 1 different
ones. Then N = N1 + · · · + Nn, and the Ni are independent Geometric with success probability
(n − i + 1)/n. So

ENi = n/(n − i + 1),   Var(Ni) = n(i − 1)/(n − i + 1)²,
and therefore

EN = n (1 + 1/2 + · · · + 1/n),

while

Var(N) = Σ_{i=1}^n n(i − 1)/(n − i + 1)² ≤ n² (1/n² + · · · + 1/2² + 1) < n² · π²/6 < 2n².
If an = n log n, then

(1/an) EN → 1,

as n → ∞, and

(1/an²) Var(N) → 0,

so that, by Theorem 9.1,

(1/an) N → 1

in probability.
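A simulation (a sketch of our own; the helper name collect_all is ours) illustrates the concentration of N around an = n log n:

```python
import math
import random

def collect_all(n, rng):
    """Number of draws, with replacement, until all n distinct cards appear."""
    seen = set()
    draws = 0
    while len(seen) < n:
        seen.add(rng.randrange(n))
        draws += 1
    return draws

rng = random.Random(3)
n = 20_000
ratios = [collect_all(n, rng) / (n * math.log(n)) for _ in range(5)]
print([round(r, 3) for r in ratios])
```

For moderate n the ratios sit a few percent above 1, reflecting the lower-order γ·n term in EN = n(log n + γ + o(1)).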
4. Let Ii be the indicator of the event that the ith game is played. Then EI1 = EI2 = EI3 =
EI4 = 1,

EI5 = 1 − p⁴ − (1 − p)⁴,
EI6 = 1 − p⁵ − 5p⁴(1 − p) − 5p(1 − p)⁴ − (1 − p)⁵,
EI7 = (6 choose 3) p³(1 − p)³ = 20 p³(1 − p)³.
Add the seven expectations to get EN. Now to compute E(N²) we use the fact that, if i > j,
Ii Ij = Ii, so that E(Ii Ij) = EIi. So

E(N²) = Σ_i EIi + 2 Σ_{i>j} E(Ii Ij) = Σ_i EIi + 2 Σ_{i=1}^7 (i − 1) EIi = Σ_{i=1}^7 (2i − 1) EIi,
and the final result can be obtained by plugging in the EIi and using the standard formula
Var(N) = E(N²) − (EN)².
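As a check (a sketch of our own; both function names are ours), the indicator computation can be compared with a direct enumeration of every way the series can unfold, for a particular p:

```python
def series_moments(p):
    """EN and Var(N) from the indicator formulas above."""
    q = 1 - p
    EI = [1.0, 1.0, 1.0, 1.0,
          1 - p**4 - q**4,
          1 - p**5 - 5 * p**4 * q - 5 * p * q**4 - q**5,
          20 * p**3 * q**3]
    EN = sum(EI)
    EN2 = sum((2 * (i + 1) - 1) * EI[i] for i in range(7))
    return EN, EN2 - EN * EN

def series_moments_direct(p):
    """The same moments by exhausting all outcomes of the series."""
    dist = [0.0] * 8                       # dist[k] = P(N = k)
    def play(a, b, prob):                  # a, b = wins so far
        if a == 4 or b == 4:
            dist[a + b] += prob
        else:
            play(a + 1, b, prob * p)
            play(a, b + 1, prob * (1 - p))
    play(0, 0, 1.0)
    EN = sum(k * dist[k] for k in range(8))
    EN2 = sum(k * k * dist[k] for k in range(8))
    return EN, EN2 - EN * EN

p = 0.55
print(series_moments(p), series_moments_direct(p))
```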
5. Imagine the balls ordered in a row, where the ordering specifies the sequence in which they
are pulled. Let Ii be the indicator of the event that the ith red ball is pulled before any black
one. Then EIi = 1/(m + 1), simply the probability that, in a random ordering of the ith red
ball and all m black ones, the red one comes first. As X = I1 + · · · + In, EX = n/(m + 1).

Now let Ji be the indicator of the event that the ith red ball is pulled between the first and
the second black one. Then EJi is the probability that the red ball is second in the ordering of
the m + 1 balls as above, so EJi = EIi, and EY = EX = n/(m + 1).
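Both expectations can be verified exactly by brute force for small n and m (a sketch of our own; here n = 3 red and m = 2 black, so EX = EY = n/(m + 1) = 1):

```python
from fractions import Fraction
from itertools import permutations

n, m = 3, 2                          # red balls 0..2, black balls 3..4
X_total = Y_total = count = 0
for order in permutations(range(n + m)):
    blacks = [pos for pos, ball in enumerate(order) if ball >= n]
    X_total += blacks[0]                       # reds before the first black
    Y_total += blacks[1] - blacks[0] - 1       # reds between 1st and 2nd black
    count += 1
EX = Fraction(X_total, count)
EY = Fraction(Y_total, count)
print(EX, EY)
```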