Chapter 16
Markov Chains - 1
Overview
Stochastic Process
Markov Chains
Chapman-Kolmogorov Equations
State classification
First passage time
Long-run properties
Absorbing states
Stochastic Processes
Suppose we take a series of observations of some random variable of interest over time.
A stochastic process is an indexed collection of random
variables {Xt}, where the index t takes values in a given set T.
(The index t often denotes time.)
Examples:
States
We'll consider processes that have a finite number of
possible values for Xt
Call these possible values states
(We may label them 0, 1, 2, ..., M)
These states will be mutually exclusive and exhaustive
What do those mean?
Mutually exclusive: the process can occupy only one state at any given time
Exhaustive: at every time, the process must occupy one of the M + 1 states
Inventory Example
A camera store stocks a particular model camera
Orders may be placed on Saturday night and the
cameras will be delivered first thing Monday morning
The store uses an (s, S) policy:
If the number of cameras in inventory is greater than or equal
to s, do not order any cameras
If the number in inventory is less than s, order enough to
bring the supply up to S
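The (s, S) rule above is a one-line decision function. A minimal sketch, assuming the values s = 1 and S = 3 that this camera-store example uses later in the deck:

```python
def order_quantity(inventory: int, s: int = 1, S: int = 3) -> int:
    """(s, S) policy: if inventory < s, order up to S; otherwise order nothing.

    s = 1 and S = 3 are the values used in the camera-store example.
    """
    return S - inventory if inventory < s else 0

# With s = 1, only an empty shelf triggers an order.
print(order_quantity(0))  # 3
print(order_quantity(2))  # 0
```

With s = 1 the policy reduces to "order up to 3 whenever the shelf is empty," which is why only state 0 places an order in the transition probabilities that follow.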
Inventory Example
Inventory Example
Inventory Example
Xt+1 = max{3 - Dt+1, 0}    if Xt < 1 (order up to 3)
Xt+1 = max{Xt - Dt+1, 0}   if Xt >= 1 (don't order)
where Dt+1 is the demand during week t + 1
Markovian Property
A stochastic process {Xt} satisfies the Markovian property if
P(Xt+1 = j | X0 = k0, X1 = k1, ..., Xt-1 = kt-1, Xt = i) = P(Xt+1 = j | Xt = i)
for all t = 0, 1, 2, ... and every possible sequence of states k0, k1, ..., kt-1, i, j
What does this mean?
Markovian Property
Interpretation:
Monopoly Example
Transition Matrix
P =
[ p00    p01    ...    p0M
  p10    p11    ...    p1M
  ...    ...    ...    ...
  pM0    pM1    ...    pMM ]
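Every row of a transition matrix is a conditional probability distribution, so its entries must be nonnegative and sum to 1. A quick validity check (the 2x2 matrices below are small illustrations, the first being the two-state example used later in the deck):

```python
def is_stochastic(P, tol=1e-9):
    """True if every entry is nonnegative and every row sums to 1."""
    return all(
        all(p >= 0 for p in row) and abs(sum(row) - 1.0) < tol
        for row in P
    )

# A valid two-state chain: each row sums to 1.
print(is_stochastic([[0.3, 0.7], [0.6, 0.4]]))  # True
# An invalid one: the first row sums to 1.1.
print(is_stochastic([[0.5, 0.6], [0.6, 0.4]]))  # False
```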
Inventory
Weather Example
Transition Probabilities
Inventory Example
Transition Probabilities
Xt+1 = max{3 - Dt+1, 0}    if Xt < 1 (order)
Xt+1 = max{Xt - Dt+1, 0}   if Xt >= 1 (don't order)
for t = 0, 1, 2, ...
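The recursion Xt+1 = max{stock - Dt+1, 0} turns a demand distribution directly into transition probabilities. A sketch, assuming weekly demand Dt+1 ~ Poisson with mean 1 as in the classic version of this camera-store example (an assumption, but one consistent with the .080/.632/.264 entries that appear in the P matrix elsewhere in this deck):

```python
from math import exp, factorial

MEAN = 1.0   # assumed Poisson demand rate
S_UP_TO = 3  # order-up-to level S

def pois(k: int) -> float:
    """P(D = k) for Poisson(MEAN) demand."""
    return exp(-MEAN) * MEAN**k / factorial(k)

def transition_row(x: int) -> list:
    """Row x of P from the recursion X_{t+1} = max{stock - D, 0}."""
    stock = S_UP_TO if x < 1 else x   # order only when the shelf is empty
    row = [0.0] * (S_UP_TO + 1)
    for j in range(1, stock + 1):
        row[j] = pois(stock - j)      # week ends with j > 0 cameras
    row[0] = 1.0 - sum(row[1:])       # demand met or exceeded the stock
    return row

P = [transition_row(x) for x in range(4)]
for row in P:
    print([round(p, 3) for p in row])
```

Row 1, for example, comes out as (.632, .368, 0, 0): with one camera and no order, the shelf either survives the week (D = 0) or is emptied.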
Inventory Example
Transition Probabilities
P =
[ .080   .184   .368   .368
  .632   .368   0      0
  .264   .368   .368   0
  .080   .184   .368   .368 ]
Interpretation:
Inventory Example
n-step Transition Probabilities
p12(3) =
A picture:
Chapman-Kolmogorov Equations
pij(n) = Σ k=0..M  pik(v) pkj(n-v)    for any v with 0 < v < n
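In matrix form the Chapman-Kolmogorov equations say P(n) = P(v) P(n-v), so in particular P(2) = P x P. A sketch using the inventory example's one-step matrix (values assumed from the classic camera-store example):

```python
# Chapman-Kolmogorov in matrix form: P(2) = P @ P.
P = [
    [0.080, 0.184, 0.368, 0.368],
    [0.632, 0.368, 0.000, 0.000],
    [0.264, 0.368, 0.368, 0.000],
    [0.080, 0.184, 0.368, 0.368],
]

def matmul(A, B):
    """Plain-Python matrix product implementing sum_k a_ik * b_kj."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

P2 = matmul(P, P)
# p13(2) is positive even though p13 = 0: state 3 is reachable
# from state 1 in two steps (via state 0) but not in one.
print(round(P2[1][3], 3))  # 0.233
```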
Chapman-Kolmogorov Equations
Weather Example
n-step Transitions
Two-step transition probability matrix:
P(2) =
Inventory Example
n-step Transitions
Two-step transition probability matrix:
P(2) = P × P =
[ .249   .286   .300   .165
  .283   .252   .233   .233
  .351   .319   .233   .097
  .249   .286   .300   .165 ]
Inventory Example
n-step Transitions
p13(2) = probability that the inventory goes from 1 camera to
3 cameras in two weeks
= .233
(note: even though p13 = 0)
Question:
Assuming the store starts with 3 cameras, find the
probability there will be 0 cameras in 2 weeks
Inventory Example
Unconditional Probabilities
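Unconditional probabilities weight the n-step transition probabilities by the initial distribution: P(Xn = j) = Σi P(X0 = i) pij(n). A sketch assuming the store starts with 3 cameras (X0 = 3, as in the question posed above), with the one-step matrix taken from the classic version of this example:

```python
P = [
    [0.080, 0.184, 0.368, 0.368],
    [0.632, 0.368, 0.000, 0.000],
    [0.264, 0.368, 0.368, 0.000],
    [0.080, 0.184, 0.368, 0.368],
]

def step(q, P):
    """One step of q(t+1) = q(t) P: weight each row of P by P(X_t = i)."""
    n = len(q)
    return [sum(q[i] * P[i][j] for i in range(n)) for j in range(n)]

q = [0.0, 0.0, 0.0, 1.0]   # assumed start: X0 = 3 with certainty
for _ in range(2):          # advance two weeks
    q = step(q, P)
print([round(x, 3) for x in q])
```

With a deterministic start in state 3, the result is simply row 3 of P(2); in particular P(X2 = 0) comes out as .249.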
Steady-State Probabilities
P(8) = P^8 =
[ .286   .285   .263   .166
  .286   .285   .263   .166
  .286   .285   .263   .166
  .286   .285   .263   .166 ]
Steady-State Probabilities
lim n→∞  pij(n) = πj    (a value independent of the starting state i)
State Classification
Accessibility
P =
[ 0     .4    0     .6    0
  0     .5    0     .5    0
  0     0     .3    0     .7
  0     0     .5    .4    .1
  0     0     0     .8    .2 ]
State Classification
Accessibility
State j is accessible from state i if pij(n) > 0 for some n ≥ 0; this is written i → j
For the example, which states are accessible from
which other states?
State Classification
Communicability
State Classes
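States i and j communicate when each is accessible from the other, and the communicating classes are exactly the strongly connected components of the transition graph. A sketch that computes them by reachability; the 5-state matrix is the accessibility example as best it can be reconstructed here, so treat the specific values as illustrative:

```python
# j is accessible from i if some path of positive-probability
# transitions leads from i to j.
P = [
    [0.0, 0.4, 0.0, 0.6, 0.0],
    [0.0, 0.5, 0.0, 0.5, 0.0],
    [0.0, 0.0, 0.3, 0.0, 0.7],
    [0.0, 0.0, 0.5, 0.4, 0.1],
    [0.0, 0.0, 0.0, 0.8, 0.2],
]

def reachable(P, i):
    """All states accessible from i (including i itself, via n = 0 steps)."""
    seen, stack = {i}, [i]
    while stack:
        u = stack.pop()
        for v, p in enumerate(P[u]):
            if p > 0 and v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

def communicating_classes(P):
    """Group together states that are accessible from each other."""
    reach = [reachable(P, i) for i in range(len(P))]
    classes = []
    for i in range(len(P)):
        cls = {j for j in reach[i] if i in reach[j]}
        if cls not in classes:
            classes.append(cls)
    return classes

print(communicating_classes(P))
```

For this matrix the classes come out as {0}, {1}, and {2, 3, 4}: states 0 and 1 can leave for {2, 3, 4} but can never be re-entered, so they are transient.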
Irreducibility
State i is said to be
Transient if there is a positive probability that the process will leave state i and never return
(some state j is accessible from i, but i is not accessible from j)
Recurrent if, upon leaving state i, the process is certain to eventually return to state i
(if state i is not transient, then it must be recurrent)
Absorbing if pii = 1, i.e. once entered, state i can never be left
(an absorbing state is a recurrent state)
Gambler's ruin:
Transient states:
Recurrent states:
Absorbing states:
Inventory problem
Transient states: none
Recurrent states: 0, 1, 2, 3 (all states communicate, so the chain is irreducible)
Absorbing states: none
Periodicity
Periodicity
Examples
P =
[ 0     1/3   2/3
  1/2   0     1/2
  3/4   1/4   0   ]
P =
[ 0     1     0     0
  1/2   0     1/2   0
  0     2/3   0     1/3
  0     1/4   3/4   0   ]
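The period of state i is the greatest common divisor of all step counts n at which return to i is possible (pii(n) > 0); a state with period 1 is aperiodic. A sketch that computes periods by scanning matrix powers; the 3-state matrix is the first example above as reconstructed, and the 2-state "flip" chain is an added illustration of period 2:

```python
from math import gcd

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def period(P, i, max_n=50):
    """gcd of the n at which return to state i has positive probability."""
    d, Pn = 0, P
    for n in range(1, max_n + 1):
        if Pn[i][i] > 1e-12:   # return to i possible in exactly n steps
            d = gcd(d, n)
        Pn = matmul(Pn, P)
    return d

P1 = [[0, 1/3, 2/3], [1/2, 0, 1/2], [3/4, 1/4, 0]]  # first example matrix
P_flip = [[0, 1], [1, 0]]                            # alternates every step
print(period(P1, 0))      # returns at n = 2 and n = 3, so gcd(2, 3) = 1
print(period(P_flip, 0))  # returns only at even n, so the period is 2
```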
Steady-State Probabilities
P(8) = P^8 =
[ .286   .285   .263   .166
  .286   .285   .263   .166
  .286   .285   .263   .166
  .286   .285   .263   .166 ]
(each row has converged to the same distribution)
Steady-State Probabilities
The steady-state probabilities πj satisfy
πj = Σ i=0..M  πi pij    for j = 0, 1, ..., M
Σ j=0..M  πj = 1
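The equations πj = Σi πi pij say that π is a fixed point of q → qP, so one simple way to find it numerically is to iterate that map until it stops changing. A sketch for the inventory example (one-step matrix assumed from the classic version of this example):

```python
P = [
    [0.080, 0.184, 0.368, 0.368],
    [0.632, 0.368, 0.000, 0.000],
    [0.264, 0.368, 0.368, 0.000],
    [0.080, 0.184, 0.368, 0.368],
]

def steady_state(P, iters=200):
    """Power iteration: repeatedly apply q <- qP until q converges."""
    n = len(P)
    q = [1.0 / n] * n                  # any starting distribution works here
    for _ in range(iters):
        q = [sum(q[i] * P[i][j] for i in range(n)) for j in range(n)]
    return q

pi = steady_state(P)
print([round(x, 3) for x in pi])  # matches the limiting rows of P(n)
```

The result, about (.286, .285, .263, .166), matches the identical rows of P^8 above; its reciprocals give expected recurrence times, e.g. 1/π0 ≈ 3.5 weeks between stockout states.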
Steady-State Probabilities
Examples
P =
[ 0.3   0.7
  0.6   0.4 ]

P =
[ 0     1/3   2/3
  1/2   0     1/2
  0     1/4   3/4 ]
Inventory example:
P =
[ .080   .184   .368   .368
  .632   .368   0      0
  .264   .368   .368   0
  .080   .184   .368   .368 ]
fij(n) = probability that, starting from state i, the first passage to state j occurs in exactly n steps
Week 2?
Week 3?
μij = E[first passage time from i to j] = Σ n=1..∞  n fij(n)    (when Σn fij(n) = 1)

μij = 1 + Σ k≠j  pik μkj    (sum over k = 0, 1, ..., M with k ≠ j)
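The recursion μij = 1 + Σk≠j pik μkj is a linear system in the unknowns μij, and since the relevant sub-matrix of P is substochastic it can be solved by simple fixed-point iteration. A sketch computing expected first passage times to state 0 in the inventory example (matrix assumed as before):

```python
P = [
    [0.080, 0.184, 0.368, 0.368],
    [0.632, 0.368, 0.000, 0.000],
    [0.264, 0.368, 0.368, 0.000],
    [0.080, 0.184, 0.368, 0.368],
]

def expected_first_passage(P, target, iters=500):
    """Iterate mu_i = 1 + sum_{k != target} p_ik * mu_k to convergence."""
    n = len(P)
    mu = [0.0] * n
    for _ in range(iters):
        mu = [1.0 + sum(P[i][k] * mu[k] for k in range(n) if k != target)
              for i in range(n)]
    return mu

mu = expected_first_passage(P, target=0)
print([round(m, 2) for m in mu])
```

Starting with 3 cameras, the expected time until the first stockout comes out as μ30 ≈ 3.5 weeks; μ00 here is the expected recurrence time of state 0, which agrees with 1/π0.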
Absorbing States
Reorder the states so the transient states come first and the absorbing states last:
P =
[ Q   R
  0   I ]
Absorbing States
(I − Q)^−1: the fundamental matrix; entry (i, j) gives the expected number of visits to transient state j starting from transient state i
(I − Q)^−1 R: entry (i, k) gives the probability of eventually being absorbed into absorbing state k starting from transient state i
0: New Account
1: Payment on account is 1 month overdue
2: Payment on account is 2 months overdue
3: Payment on account is 3 months overdue
4: Account paid in full
5: Account is written off as bad debt
Let Q be the block of transition probabilities among the transient states 0-3, and R the block from the transient states into the absorbing states 4 and 5
We get
(I − Q)^−1 =
[ 1    .6   .3   .12
  0    1    .5   .2
  0    0    1    .4
  0    0    0    1   ]

(I − Q)^−1 R =
[ .964   .036
  .940   .060
  .880   .120
  .700   .300 ]
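These matrices can be reproduced directly. The Q and R below are inferred so as to be consistent with the (I − Q)^−1 and (I − Q)^−1 R shown on the slide (aging probabilities .6, .5, .4 and payment probabilities .4, .5, .6, .7, with write-off only from the 3-months-overdue state), so treat them as an assumed reconstruction. Because an account can only age one month at a time, Q is nilpotent and the inverse is a finite sum:

```python
# Transient states 0-3 (months overdue); absorbing: paid, written off.
# Q and R are an assumed reconstruction consistent with the slide.
Q = [[0.0, 0.6, 0.0, 0.0],
     [0.0, 0.0, 0.5, 0.0],
     [0.0, 0.0, 0.0, 0.4],
     [0.0, 0.0, 0.0, 0.0]]
R = [[0.4, 0.0],              # columns: paid in full, written off
     [0.5, 0.0],
     [0.6, 0.0],
     [0.7, 0.3]]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def matadd(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

# Q^4 = 0, so (I - Q)^-1 = I + Q + Q^2 + Q^3 exactly.
I = [[float(i == j) for j in range(4)] for i in range(4)]
N = I
power = I
for _ in range(3):
    power = matmul(power, Q)
    N = matadd(N, power)

B = matmul(N, R)  # absorption probabilities
print([round(x, 3) for x in B[0]])  # new account: P(paid), P(written off)
```

Row 0 of the result reproduces the slide's .964 / .036: a new account is eventually paid in full with probability .964 and written off with probability .036.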