
Math171: Stochastic Processes

Problem Set 1

Michael Powell
Department of Economics, UCLA

April 9th, 2005

1 Exercise 1.9.1
1.1 Question
A fair coin is tossed repeatedly with results $Y_0, Y_1, \ldots$ that are 0 or 1 with probability $\frac{1}{2}$ each. For $n \geq 1$ let $X_n = Y_n + Y_{n-1}$ be the number of 1's in the $(n-1)$th and $n$th tosses. Is $X_n$ a Markov chain?

1.2 Answer
In short, the answer is no. A Markov chain is a stochastic process with the property that

\[
P(X_{n+1} = j \mid X_0 = i_0, \ldots, X_{n-1} = i_{n-1}, X_n = i) = p(i, j).
\]

Suppose $X_n = 1$ and $X_{n-1} = 2$. Then we have that
\begin{align*}
P(X_{n+1} = 2 \mid X_n = 1 \cap X_{n-1} = 2)
&= P(Y_{n+1} = 1 \cap Y_n = 1 \mid ((Y_n = 1 \cap Y_{n-1} = 0) \cup (Y_n = 0 \cap Y_{n-1} = 1)) \cap Y_{n-1} = 1 \cap Y_{n-2} = 1) \\
&= 0,
\end{align*}
since conditioning on $X_{n-1} = 2$ forces $Y_{n-1} = 1$ and hence $Y_n = 0$. But
\begin{align*}
P(X_{n+1} = 2 \mid X_n = 1)
&= P(Y_{n+1} = 1 \cap Y_n = 1 \mid (Y_n = 1 \cap Y_{n-1} = 0) \cup (Y_n = 0 \cap Y_{n-1} = 1)) \\
&= \frac{P(Y_{n+1} = 1 \cap Y_n = 1 \cap Y_{n-1} = 0) + P(Y_{n+1} = 1 \cap Y_n = 1 \cap Y_n = 0 \cap Y_{n-1} = 1)}{P(Y_n = 1 \cap Y_{n-1} = 0) + P(Y_n = 0 \cap Y_{n-1} = 1)} \\
&= \frac{\frac{1}{8} + 0}{\frac{1}{2}} = \frac{1}{4}.
\end{align*}
That is, $P(X_{n+1} = 2 \mid X_n = 1 \cap X_{n-1} = 2) \neq P(X_{n+1} = 2 \mid X_n = 1)$ in general, i.e., $\{X_n\}$ is not a Markov
process.
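
As a quick sanity check (my own addition, not part of the assigned solution), the following Python sketch estimates both conditional probabilities by simulation with $n = 3$; the estimates come out near $0$ and $\frac{1}{4}$.

import random

# Monte Carlo check: estimate P(X_4 = 2 | X_3 = 1, X_2 = 2) and P(X_4 = 2 | X_3 = 1).
random.seed(0)
trials = 200_000
hits_given_both = total_given_both = 0
hits_given_xn = total_given_xn = 0
for _ in range(trials):
    y = [random.randint(0, 1) for _ in range(5)]   # Y_0, ..., Y_4
    x = [y[i] + y[i - 1] for i in range(1, 5)]     # X_1, ..., X_4
    x1, x2, x3, x4 = x
    if x3 == 1:
        total_given_xn += 1
        hits_given_xn += (x4 == 2)
        if x2 == 2:
            total_given_both += 1
            hits_given_both += (x4 == 2)
print(hits_given_both / total_given_both)   # ~ 0.0
print(hits_given_xn / total_given_xn)       # ~ 0.25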

2 Exercise 1.9.2
2.1 Question

Five white balls and five black balls are distributed in two urns in such a way that each urn contains five balls. At each step we draw one ball from each urn and exchange them. Let $X_n$ be the number of white balls in the left urn at time $n$. Compute the transition probability for $X_n$.

2.2 Answer

Given $1 \leq X_n = k \leq 4$, the probability of decreasing the number of white balls in the left urn is equal to the probability of drawing a white ball from the left urn and a black ball from the right urn. That is,
\[
p(X_{n+1} = k - 1 \mid X_n = k) = \frac{k}{5} \cdot \frac{k}{5} = \frac{k^2}{25}.
\]
The only way that the number of white balls in the left urn can remain constant is if the same colored ball is drawn from each urn. That is,
\[
p(X_{n+1} = k \mid X_n = k) = \frac{k}{5} \cdot \frac{5 - k}{5} + \frac{5 - k}{5} \cdot \frac{k}{5} = \frac{2k(5 - k)}{25}.
\]
Finally, in order to increase the number of white balls in the left urn, we must draw a black ball from the left urn and a white ball from the right urn. That is,
\[
p(X_{n+1} = k + 1 \mid X_n = k) = \frac{5 - k}{5} \cdot \frac{5 - k}{5} = \frac{(5 - k)^2}{25}.
\]
If $X_n = 0$, we have that
\[
p(X_{n+1} = 1 \mid X_n = 0) = 1,
\]
and if $X_n = 5$, we have that
\[
p(X_{n+1} = 4 \mid X_n = 5) = 1.
\]
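
A short Python sketch of my own (the helper name urn_transition_matrix is made up) that assembles the resulting $6 \times 6$ transition matrix from these formulas and confirms that each row sums to one:

import numpy as np

def urn_transition_matrix():
    P = np.zeros((6, 6))
    for k in range(6):
        if k >= 1:
            P[k, k - 1] = k**2 / 25           # white from left, black from right
        P[k, k] = 2 * k * (5 - k) / 25        # same colour drawn from both urns
        if k <= 4:
            P[k, k + 1] = (5 - k)**2 / 25     # black from left, white from right
    return P

P = urn_transition_matrix()
print(P)
print(P.sum(axis=1))   # each row sums to 1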

3 Exercise 1.9.3
3.1 Question

Suppose that the probability it rains today is 0.3 if neither of the last two days was rainy, but 0.6 if at least one of the last two days was rainy. Let the weather on day $n$, $W_n$, be R for rain, or S for sun. $W_n$ is not a Markov chain, but the weather for the last two days, $X_n = (W_{n-1}, W_n)$, is a Markov chain with four states $\{RR, RS, SR, SS\}$. (a) Compute its transition probability. (b) Compute the two-step transition probability. (c) What is the probability it will rain on Wednesday given that it did not rain on Sunday or Monday?

3.2 Answer to (a)

I will make my calculations explicit only for a couple of entries. The rest, I will simply fill in.
\begin{align*}
p(RS \mid RR) &= p(W_{n-1} = R \cap W_n = S \mid W_{n-2} = R \cap W_{n-1} = R) \\
&= p(W_n = S \mid W_{n-1} = R \cap W_{n-2} = R) = 1 - 0.6 = 0.4 \\
p(SS \mid RS) &= p(W_{n-1} = S \cap W_n = S \mid W_{n-2} = R \cap W_{n-1} = S) \\
&= p(W_n = S \mid W_{n-1} = S \cap W_{n-2} = R) = 1 - 0.6 = 0.4 \\
p(SR \mid RR) &= p(W_{n-1} = S \cap W_n = R \mid W_{n-2} = R \cap W_{n-1} = R) \\
&= 0 \quad \text{since } \{W_{n-1} = S\} \cap \{W_{n-1} = R\} = \emptyset
\end{align*}

         RR    RS    SR    SS
    RR   0.6   0.4   0     0
P = RS   0     0     0.6   0.4
    SR   0.6   0.4   0     0
    SS   0     0     0.3   0.7
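
For completeness, here is a small Python sketch of my own (with the state order RR, RS, SR, SS assumed) that builds this matrix directly from the stated rain probabilities:

import numpy as np

states = ["RR", "RS", "SR", "SS"]
P = np.zeros((4, 4))
for i, (prev, today) in enumerate(states):
    # rain tomorrow w.p. 0.6 if at least one of the last two days was rainy, else 0.3
    p_rain = 0.6 if "R" in (prev, today) else 0.3
    for j, (a, b) in enumerate(states):
        if a != today:          # tomorrow's state must start with today's weather
            continue
        P[i, j] = p_rain if b == "R" else 1 - p_rain
print(P)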

3.3 Answer to (b)
           RR    RS    SR    SS
      RR   0.36  0.24  0.24  0.16
P^2 = RS   0.36  0.24  0.12  0.28
      SR   0.36  0.24  0.24  0.16
      SS   0.18  0.12  0.21  0.49

3.4 Answer to (c)

If it did not rain on Sunday or Monday, the chain is in state $SS$ on Monday, and rain on Wednesday means the chain is in state $RR$ or $SR$ two steps later. Since these two states are disjoint,
\begin{align*}
p^2(RR \cup SR \mid SS) &= \frac{p^2((RR \cup SR) \cap SS)}{p^2(SS)} \\
&= \frac{p^2(RR \cap SS) + p^2(SR \cap SS)}{p^2(SS)} \\
&= \frac{p^2(RR \cap SS)}{p^2(SS)} + \frac{p^2(SR \cap SS)}{p^2(SS)} \\
&= p^2(RR \mid SS) + p^2(SR \mid SS) \\
&= 0.18 + 0.21 = 0.39.
\end{align*}
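
The same numbers can be checked numerically; the sketch below (my addition, re-entering the matrix from part (a)) squares $P$ and reads off the two entries used above:

import numpy as np

P = np.array([[0.6, 0.4, 0.0, 0.0],    # RR
              [0.0, 0.0, 0.6, 0.4],    # RS
              [0.6, 0.4, 0.0, 0.0],    # SR
              [0.0, 0.0, 0.3, 0.7]])   # SS
P2 = P @ P
print(P2)
# Rain on Wednesday <=> Wednesday's state is RR or SR (columns 0 and 2), starting from SS (row 3):
print(P2[3, 0] + P2[3, 2])   # 0.18 + 0.21 = 0.39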

4 Exercise 1.9.4
4.1 Question

Consider a gambler’s ruin chain with $N = 4$. That is, if $1 \leq i \leq 3$, $p(i, i+1) = 0.4$ and $p(i, i-1) = 0.6$, but the endpoints are absorbing states: $p(0, 0) = 1$ and $p(4, 4) = 1$. Compute $p^3(1, 4)$ and $p^3(1, 0)$.

4.2 Answer

Given the transition probabilities, we can construct the transition matrix:

        0     1     2     3     4
    0   1     0     0     0     0
    1   0.6   0     0.4   0     0
P = 2   0     0.6   0     0.4   0
    3   0     0     0.6   0     0.4
    4   0     0     0     0     1

The three-step transition probabilities are just the corresponding entries of the cube of the transition matrix:

          0      1      2      3      4
      0   1      0      0      0      0
      1   0.744  0      0.192  0      0.064
P^3 = 2   0.36   0.288  0      0.192  0.16
      3   0.216  0      0.288  0      0.496
      4   0      0      0      0      1

Therefore, we have that $p^3(1, 4) = 0.064$ and $p^3(1, 0) = 0.744$.
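
A quick numerical check of these two entries (my addition, using NumPy's matrix power):

import numpy as np

P = np.array([[1.0, 0.0, 0.0, 0.0, 0.0],
              [0.6, 0.0, 0.4, 0.0, 0.0],
              [0.0, 0.6, 0.0, 0.4, 0.0],
              [0.0, 0.0, 0.6, 0.0, 0.4],
              [0.0, 0.0, 0.0, 0.0, 1.0]])
P3 = np.linalg.matrix_power(P, 3)
print(P3[1, 4])   # 0.064
print(P3[1, 0])   # 0.744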

5 Exercise 1.9.5
5.1 Question

A taxicab driver moves between the airport A and two hotels B and C according to the following rules. If he is at the airport, he will be at one of the two hotels next with equal probability. If at a hotel, then he returns to the airport with probability $\frac{3}{4}$ and goes to the other hotel with probability $\frac{1}{4}$. (a) Find the transition matrix for the chain. (b) Suppose the driver begins at the airport at time 0. Find the probability for each of his three possible locations at time 2 and the probability he is at hotel B at time 3.

5.2 Answer to (a)

        A      B      C
    A   0      0.5    0.5
P = B   0.75   0      0.25
    C   0.75   0.25   0

5.3 Answer to (b)

It is first necessary to find the two-step and three-step transition matrices for this Markov chain:
\[
P^2 = \begin{pmatrix} 0.75 & 0.125 & 0.125 \\ 0.1875 & 0.4375 & 0.375 \\ 0.1875 & 0.375 & 0.4375 \end{pmatrix},
\qquad
P^3 = \begin{pmatrix} 0.1875 & 0.40625 & 0.40625 \\ 0.609375 & 0.1875 & 0.203125 \\ 0.609375 & 0.203125 & 0.1875 \end{pmatrix}.
\]
Given the initial distribution $\mu_0 = \begin{pmatrix} 1 & 0 & 0 \end{pmatrix}$, we have that
\begin{align*}
\begin{pmatrix} p_2(A) & p_2(B) & p_2(C) \end{pmatrix} &= \mu_0 P^2 \\
&= \begin{pmatrix} 1 & 0 & 0 \end{pmatrix} \begin{pmatrix} 0.75 & 0.125 & 0.125 \\ 0.1875 & 0.4375 & 0.375 \\ 0.1875 & 0.375 & 0.4375 \end{pmatrix} \\
&= \begin{pmatrix} 0.75 & 0.125 & 0.125 \end{pmatrix}.
\end{align*}
In addition, we have that
\begin{align*}
\begin{pmatrix} p_3(A) & p_3(B) & p_3(C) \end{pmatrix} &= \mu_0 P^3 \\
&= \begin{pmatrix} 1 & 0 & 0 \end{pmatrix} \begin{pmatrix} 0.1875 & 0.40625 & 0.40625 \\ 0.609375 & 0.1875 & 0.203125 \\ 0.609375 & 0.203125 & 0.1875 \end{pmatrix} \\
&= \begin{pmatrix} 0.1875 & 0.40625 & 0.40625 \end{pmatrix}.
\end{align*}
In particular, $p_3(B) = 0.40625$.
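
These vectors can be checked with a few lines of Python (my addition): propagate the point mass at A through $P^2$ and $P^3$.

import numpy as np

P = np.array([[0.0,  0.5,  0.5 ],   # A
              [0.75, 0.0,  0.25],   # B
              [0.75, 0.25, 0.0 ]])  # C
mu0 = np.array([1.0, 0.0, 0.0])     # starts at the airport
print(mu0 @ np.linalg.matrix_power(P, 2))       # [0.75, 0.125, 0.125]
print((mu0 @ np.linalg.matrix_power(P, 3))[1])  # P(at B at time 3) = 0.40625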

6 Additional Problem 1
6.1 Question

Consider two urns, A and B, containing a total of $N$ balls. Let $X_n$ be the number of balls in urn A at time $n$. If $X_n = k$, an urn is chosen with probabilities $\frac{k}{N}$ and $1 - \frac{k}{N}$ respectively, and a ball is chosen uniformly from all $N$ balls. The chosen ball is placed in the chosen urn. Find the transition probabilities for this Markov chain.

6.2 Answer

First, I shall consider the case where $1 \leq X_n = k \leq N - 1$.

The probability of the number of balls in urn A decreasing is the probability of a ball being chosen from urn A and being placed in urn B. That is,
\[
p(X_{n+1} = k - 1 \mid X_n = k) = \frac{k}{N} \cdot \frac{N - k}{N}.
\]
The probability of the number of balls in urn A remaining constant is the probability that a ball that is chosen from either urn A or urn B is placed back in the urn from which it was drawn. That is,
\[
p(X_{n+1} = k \mid X_n = k) = \frac{k}{N} \cdot \frac{k}{N} + \frac{N - k}{N} \cdot \frac{N - k}{N}.
\]
The probability of the number of balls in urn A increasing is the probability of a ball being chosen from urn B and being placed in urn A. That is,
\[
p(X_{n+1} = k + 1 \mid X_n = k) = \frac{N - k}{N} \cdot \frac{k}{N}.
\]
All other transition probabilities are zero, since it is never possible to change the location of more than one ball in a given round.

The cases where $X_n = 0$ and $X_n = N$ are trivial: if there are no balls in urn A, then urn A is never chosen, so the drawn ball (necessarily from urn B) is always returned to urn B; similarly, if all the balls are in urn A, then urn A is always chosen and the drawn ball is returned to urn A. Formally,
\begin{align*}
P(X_{n+1} = 0 \mid X_n = 0) &= \frac{0}{N} \cdot \frac{0}{N} + \frac{N}{N} \cdot \frac{N}{N} = 1, \\
P(X_{n+1} = N \mid X_n = N) &= \frac{N}{N} \cdot \frac{N}{N} + \frac{0}{N} \cdot \frac{0}{N} = 1,
\end{align*}
so both endpoints are absorbing.
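
As an illustration (my own helper, not part of the problem; the name ball_transfer_matrix is made up), the following sketch builds the $(N+1) \times (N+1)$ transition matrix from these formulas for $N = 5$ and checks that every row sums to one:

import numpy as np

def ball_transfer_matrix(N):
    P = np.zeros((N + 1, N + 1))
    for k in range(N + 1):
        if k >= 1:
            P[k, k - 1] = (k / N) * ((N - k) / N)      # ball from A, urn B chosen
        P[k, k] = (k / N) ** 2 + ((N - k) / N) ** 2    # ball returned to its own urn
        if k <= N - 1:
            P[k, k + 1] = ((N - k) / N) * (k / N)      # ball from B, urn A chosen
    return P

P = ball_transfer_matrix(5)
print(P)
print(P.sum(axis=1))   # all ones; states 0 and N are absorbing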

7 Additional Problem 2
7.1 Question

Recall that a stochastic matrix is one that has nonnegative entries and row sums equal to 1. Every such matrix corresponds to a Markov chain. Not every stochastic matrix can be the two-step transition matrix for a Markov chain. Show that a $2 \times 2$ stochastic matrix is the two-step transition matrix for a Markov chain if and only if the sum of its diagonal entries is at least 1.

7.2 Answer

Let $P = \begin{pmatrix} 1 - p & p \\ q & 1 - q \end{pmatrix}$, $p, q \in [0, 1]$. Then
\begin{align*}
P^2 &= \begin{pmatrix} (1 - p)^2 + pq & p(1 - p) + p(1 - q) \\ q(1 - p) + q(1 - q) & (1 - q)^2 + pq \end{pmatrix} \\
&= \begin{pmatrix} (1 - p)^2 + pq & p(2 - p - q) \\ q(2 - p - q) & (1 - q)^2 + pq \end{pmatrix}.
\end{align*}
Let $A$ be a $2 \times 2$ stochastic matrix. That is, $A = \begin{pmatrix} 1 - a & a \\ b & 1 - b \end{pmatrix}$ where $a, b \in [0, 1]$.

Suppose $A = P^2$, that is, an arbitrary stochastic matrix is a two-step transition matrix for a Markov chain. Then we have
\begin{align}
1 - a &= (1 - p)^2 + pq \nonumber \\
a &= p(2 - p - q) \tag{1} \\
b &= q(2 - p - q) \tag{2} \\
1 - b &= (1 - q)^2 + pq \nonumber
\end{align}
The question is when such a $p$ and $q$ exist. Adding (1) and (2),
\begin{align*}
a + b &= p(2 - p - q) + q(2 - p - q) \\
&= 2p - p^2 - pq + 2q - pq - q^2 \\
&= -p^2 + 2p - 2pq + 2q - q^2 \\
&= p^2(-1) + p(2 - 2q) + 2q - q^2,
\end{align*}
so
\[
0 = p^2(-1) + p(2 - 2q) + (2q - q^2 - a - b).
\]
Using the quadratic formula, we have
\begin{align*}
p &= \frac{-(2 - 2q) \pm \sqrt{(2 - 2q)^2 - 4(-1)(2q - q^2 - a - b)}}{-2} \\
&= 1 - q \pm \tfrac{1}{2}\sqrt{4 - 8q + 4q^2 + 8q - 4q^2 - 4a - 4b} \\
&= 1 - q \pm \tfrac{1}{2}\sqrt{4 - 4a - 4b} \\
&= 1 - q \pm \sqrt{1 - a - b},
\end{align*}
so that
\[
p + q = 1 \pm \sqrt{1 - a - b},
\]
which is real only when $1 - a - b \geq 0 \Rightarrow a + b \leq 1 \Rightarrow 2 - a - b \geq 1$. That is, the diagonal entries of $A$, which sum to $2 - a - b$, sum to at least one.

Further, dividing (1) by (2) gives $\frac{p}{q} = \frac{a}{b}$. Therefore, when $2 - a - b \geq 1$, the entries of $P$ can be taken to be
\begin{align*}
p &= \frac{a}{a + b}\left(1 \pm \sqrt{1 - a - b}\right), \\
q &= \frac{b}{a + b}\left(1 \pm \sqrt{1 - a - b}\right),
\end{align*}
using the choice of sign that keeps $p$ and $q$ in $[0, 1]$.

Therefore, any $2 \times 2$ stochastic matrix with diagonal entries which sum up to 1 or more can be a two-step
matrix for a Markov chain.
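
A numerical illustration of this construction (my addition, with $a = 0.3$, $b = 0.4$ chosen arbitrarily as an example): recover $p$ and $q$ from the formulas above and confirm $P^2 = A$.

import numpy as np

a, b = 0.3, 0.4                      # A = [[1-a, a], [b, 1-b]], diagonal sum = 1.3 >= 1
s = 1 - np.sqrt(1 - a - b)           # one admissible sign choice; here p, q land in [0, 1]
p, q = a / (a + b) * s, b / (a + b) * s
P = np.array([[1 - p, p], [q, 1 - q]])
A = np.array([[1 - a, a], [b, 1 - b]])
print(P @ P)                         # matches A
print(np.allclose(P @ P, A))         # True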

8 Additional Problem 3
8.1 Question

Suppose $X_1, X_2, \ldots$ are random variables taking the values 0 and 1, and satisfying
\[
P(X_n = 1 \mid X_1 = i_1, X_2 = i_2, \ldots, X_{n-1} = i_{n-1}) \geq \delta
\]
for all $n \geq 1$ and all choices of $i_1, \ldots, i_{n-1} \in \{0, 1\}$. Show that if $\delta > 0$, then
(a) $P(X_n = 1$ for some $n) = 1$
and
(b) $P(X_n = 1$ for infinitely many $n) = 1$.
(Suggestion: Look at the complementary events.)

8.2 Answer to (a)

Let $A = \{X_n = 1 \text{ for some } n\}$. Then $A^c = \{X_n = 0 \text{ for all } n\}$; equivalently, $A^c = \{X_1 = 0\} \cap \{X_2 = 0\} \cap \cdots \cap \{X_n = 0\} \cap \cdots$. Therefore,
\begin{align*}
P(A) = 1 - P(A^c) &= 1 - P(X_1 = 0 \cap X_2 = 0 \cap \cdots \cap X_n = 0 \cap \cdots) \\
&= 1 - \lim_{n \to \infty} P(X_1 = 0)\,P(X_2 = 0 \mid X_1 = 0) \cdots P(X_n = 0 \mid X_1 = 0, \ldots, X_{n-1} = 0) \\
&\geq 1 - \lim_{n \to \infty} (1 - \delta)^n = 1, \quad \text{since } 0 \leq 1 - \delta < 1.
\end{align*}
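
As a rough numerical illustration (my addition), under the extreme assumption that the conditional probability always equals the lower bound $\delta$, the time of the first 1 is geometric with mean $1/\delta$ and comes out finite in every simulated run:

import random

random.seed(1)
delta = 0.05
first_hit_times = []
for _ in range(10_000):
    n = 1
    while random.random() >= delta:   # X_n = 1 with probability delta
        n += 1
    first_hit_times.append(n)
print(max(first_hit_times))                          # finite in every run
print(sum(first_hit_times) / len(first_hit_times))   # ~ 1/delta = 20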

8.3 Answer to (b)


Define
\begin{align*}
T_1 &\equiv \min\{t \in \mathbb{N} \mid X_t = 1\} \\
T_2 &\equiv \min\{t > T_1 \mid X_t = 1\} \\
&\;\;\vdots \\
T_n &\equiv \min\{t > T_{n-1} \mid X_t = 1\} \\
&\;\;\vdots
\end{align*}
From part (a), we know that $P(X_n = 1$ for some $n) = 1$. Therefore, $P(T_1 < \infty) = 1$. Repeating the argument from (a) after time $T_{n-1}$ (the hypothesis bounds the conditional probability of a 1 below by $\delta$ given any past, so the variables after $T_{n-1}$ again produce a 1 with probability one), we have that $P(T_n < \infty) = 1$ for all $n$.

Let
\begin{align*}
B &= \{X_n = 1 \text{ for infinitely many } n\} \\
&= \{\forall N \in \mathbb{N},\ \exists n \geq N \text{ such that } X_n = 1\} \\
&= \{T_n < \infty \text{ for all } n\} \\
&= \bigcap_{n=1}^{\infty} \{T_n < \infty\}.
\end{align*}
Then we have
\begin{align*}
P(B) &= P\!\left(\bigcap_{n=1}^{\infty} \{T_n < \infty\}\right) \\
&= \lim_{n \to \infty} P(T_1 < \infty)\,P(T_2 < \infty \mid T_1 < \infty) \cdots P(T_n < \infty \mid T_1 < \infty, \ldots, T_{n-1} < \infty) \\
&= \lim_{n \to \infty} \prod_{j=1}^{n} 1 = 1,
\end{align*}
since each conditional probability equals 1: every $T_j$ is finite with probability one by the argument above, and $\{T_j < \infty\} \subseteq \{T_{j-1} < \infty\}$.
