Chapter 4
Kun(Quinn) Zhao
Department of Industrial and Systems Engineering
University of Florida
Adapted from slides by Dr. Ruiwei Jiang and originally created by Dr. Jean-Philippe P. Richard
Stochastic processes
Stochastic process: a sequence of random variables
X = {X0 , X1 , X2 , . . . } = {Xt : t ∈ T }
Time parameter: the index t
State of the process at time t: Xt
E.g.:
Stock price
Inventory at the end of the day
Daily traffic of a website
Queue length
Stochastic processes
Types:
Discrete time stochastic process
- T = {0, 1, . . . }, discrete (countable)
Continuous time stochastic process
- T = R+ , continuous
E.g.:
Stock price
Inventory at the end of the day
Daily traffic of a website
Queue length
Stochastic processes
Example: Gainesville weather
Is this a stochastic process?
Yes - the process can be described by a series of
RVs: whether it rains or not.
Stochastic processes
Example: Quinn's mood
Is this a stochastic process?
Yes - the process can be described by a series of
random variables describing her mood
Stochastic processes
Example: credit evaluation
The balance of a customer's bill is in one of four
types: fully paid (type 0), 1 - 30 days in arrears (type
1), 31 - 60 days in arrears (type 2), bad debt (type 3).
The accounts are checked monthly to determine the
state of each customer.
Customers are expected to pay their bills within 30
days. Sometimes, they pay only portions of their bills:
If the balance is within 30 days in arrears, the
customer is moved to type 1.
If the balance is between 31 and 60 days in arrears,
the customer is moved to type 2.
If the balance is more than 60 days in arrears, it becomes
bad debt (type 3) and the bill is sent to a collection agency.
Stochastic processes
Example: machine downtime
A shop has two identical machines that are operated
continuously except when they break down.
The top-priority assignment for a full-time
maintenance person is to repair them whenever
needed
The time required to repair a machine has an
exponential distribution with a mean of half a day.
Once the repair of a machine is completed, the time
until the next breakdown of that machine has an
exponential distribution with a mean of two days.
These distributions are independent.
Stochastic processes
Example: machine downtime
Is this a stochastic process?
Yes - the process can be described by random variables
counting the number of machines that are functioning
Stochastic processes
Consider a stochastic process {Xt , t ∈ T }
If the Xt 's are independent, the stochastic process is very
easy to study
However, independence implies that the past of the system
does not influence the current state of the system
Such processes are of little use in practice
Therefore,
Stochastic processes are most useful when the state of the
system Xt at time t depends on the previous states
Xs for s < t
The kind of dependence should be simple enough to study
yet useful in practice
Stochastic processes
Markovian property
A discrete time stochastic process {Xt , t ∈ T } is said to
have the Markovian property if
Pr(Xt+1 = j | Xt = i, Xt−1 = it−1 , . . . , X0 = i0 ) = Pr(Xt+1 = j | Xt = i)
for all states i0 , . . . , it−1 , i, j and all t, i.e., given the
present state, the future is independent of the past.
Stochastic processes
Markov chain
A discrete time stochastic process {Xt , t ∈ T } is said to be a
Markov chain (MC) if it has the Markovian property.
Recall: Gainesville weather
What are the random variables Xt in this example?
Xt =
  0, if day t is dry,
  1, if day t is rainy.
Is this stochastic process a Markov chain?
The process is discrete time.
The process has the Markovian property - tomorrow's
weather only depends on today's weather
Markov chain
Transition probabilities
Given a MC, we refer to the conditional probabilities
P(Xt+1 = j|Xt = i)
as transition probabilities.
Stationary
The transition probabilities are stationary if
P(Xt+1 = j|Xt = i) = P(X1 = j|X0 = i)
for all i, j ∈ S and all t = 0, 1, . . .
Markov chain
Transition matrix
The transition matrix of a MC with stationary
transition probabilities is the matrix P whose entry
Pij is the transition probability from state i to state j,
i, j ∈ S
Properties of the entries of a transition matrix P:
Pij ≥ 0 for all i, j ∈ S.
Σ_{j∈S} Pij = 1 for all i ∈ S.
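These two properties are easy to verify numerically; a minimal sketch in Python (the matrix used is the Gainesville weather chain from the slides, and `is_stochastic` is an illustrative helper name):

```python
import numpy as np

# Weather chain from the slides: state 0 = dry, state 1 = rainy.
P = np.array([[0.8, 0.2],
              [0.6, 0.4]])

def is_stochastic(P, tol=1e-9):
    """Check the two defining properties of a transition matrix:
    nonnegative entries, and each row summing to 1."""
    P = np.asarray(P, dtype=float)
    return bool((P >= 0).all() and np.allclose(P.sum(axis=1), 1.0, atol=tol))

print(is_stochastic(P))  # → True
```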
Markov chain
Gainesville weather example
P(Xt+1 = 0|Xt = 0) = 0.8, P(Xt+1 = 0|Xt = 1) = 0.6,
P(Xt+1 = 1|Xt = 0) = 0.2, P(Xt+1 = 1|Xt = 1) = 0.4.
Markov chain
Transition diagram
Given a MC with stationary transition probabilities, its
transition diagram is the graph obtained as follows:
A node for every state of the chain
An arc between every pair of nodes (i, j) with Pij > 0
Each arc labeled with the transition probability Pij ,
i, j ∈ S.
The transition diagram for Gainesville weather
[Transition diagram: states 0 and 1; self-loops 0 → 0 with 0.8
and 1 → 1 with 0.4; arcs 0 → 1 with 0.2 and 1 → 0 with 0.6.]
Markov chain
Mood example:
Quinn is either cheerful (C), so-so (S), or down (D) on
any given day
If she is cheerful today, then she will be C, S or D
tomorrow with probabilities 0.5, 0.4, 0.1, respectively
If she is so-so today, then she will be C, S or D
tomorrow with probabilities 0.3, 0.4, 0.3.
If she is feeling down today, then she will be C, S or D
tomorrow with probabilities 0.2, 0.3, 0.5.
Question: What are the transition probabilities and
diagram of the MC associated with this problem?
Markov chain
Mood example:
First, define the states:
Xt =
  C, if she is cheerful,
  S, if she is so-so,
  D, if she is down.
Obtain the transition matrix from the problem
description (rows and columns ordered C, S, D):
P =
  0.5 0.4 0.1
  0.3 0.4 0.3
  0.2 0.3 0.5
Markov chain
Mood example:
Draw the transition diagram
[Transition diagram: nodes C, S, D; self-loops C → C 0.5,
S → S 0.4, D → D 0.5; arcs C → S 0.4, C → D 0.1, S → C 0.3,
S → D 0.3, D → C 0.2, D → S 0.3.]
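A chain like this is easy to get a feel for by simulation; a small sketch in pure Python, using the transition rows given above (the function and variable names are illustrative):

```python
import random

# Transition rows of the mood chain, as given on the slide.
P = {
    "C": [("C", 0.5), ("S", 0.4), ("D", 0.1)],
    "S": [("C", 0.3), ("S", 0.4), ("D", 0.3)],
    "D": [("C", 0.2), ("S", 0.3), ("D", 0.5)],
}

def next_state(state, rng):
    """Sample tomorrow's mood given today's mood."""
    u, cum = rng.random(), 0.0
    for s, p in P[state]:
        cum += p
        if u < cum:
            return s
    return P[state][-1][0]  # guard against floating-point round-off

rng = random.Random(0)
path = ["C"]                 # start on a cheerful day
for _ in range(364):
    path.append(next_state(path[-1], rng))
print(len(path), set(path))  # a year of moods over states C, S, D
```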
Markov chain
Machine downtime example:
A shop has two identical machines that are operated
continuously except when they break down.
Suppose that if a machine is up on day n, it is up on
the (n + 1)st day with probability p1 , independent of
the past.
On the other hand, if it is down on the nth day, it stays
down on the (n + 1)st day with probability p2 , also
independent of the past.
Can this process be represented by a MC? If so what
are its transition probabilities and diagram?
Markov chain
Machine downtime example:
First, let Xn be the number of machines that are up on
day n. Then {Xn , n ≥ 0} is a MC with state space
S = {0, 1, 2}.
Obtain the transition matrix:
P =
  p2^2           2 p2 (1−p2)               (1−p2)^2
  p2 (1−p1)      p1 p2 + (1−p1)(1−p2)      p1 (1−p2)
  (1−p1)^2       2 p1 (1−p1)               p1^2
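These entries can be built (and sanity-checked) for any p1, p2; a sketch, with `machine_matrix` an illustrative name:

```python
# Sketch: build the 3x3 machine-downtime matrix for given p1 (up machine
# stays up) and p2 (down machine stays down); state = number of machines up.
def machine_matrix(p1, p2):
    q1, q2 = 1 - p1, 1 - p2
    return [
        [p2 * p2,   2 * p2 * q2,        q2 * q2],  # from state 0: both down
        [q1 * p2,   p1 * p2 + q1 * q2,  p1 * q2],  # from state 1: one up
        [q1 * q1,   2 * p1 * q1,        p1 * p1],  # from state 2: both up
    ]

P = machine_matrix(0.8, 0.4)
# every row of a transition matrix must sum to 1
print([round(sum(row), 10) for row in P])  # → [1.0, 1.0, 1.0]
```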
Markov chain
Repair shop example:
Jason's electronic repair shop stocks a particular
model of cell phone screen that can be ordered
weekly.
The demand for this screen is random: it is 0 with
prob. 0.368, 1 with prob. 0.368, 2 with prob. 0.184, 3
with prob. 0.060, and 4 with prob. 0.020.
At the end of the week, Jason places an order that is
delivered in time for next Monday.
Jason's current policy is to order 3 screens if the
inventory is empty and order nothing otherwise.
Question: can this process be represented by a MC? If so
what are its transition probabilities and diagram?
Markov chain
Repair shop example:
Computing transition probabilities P(Xt+1 = j|Xt = i):
P(Xt+1 = 0|Xt = 0) = 0.06 + 0.02 = 0.08
He will re-order and start the week with 3 screens, and
the demand in week t + 1 is greater than or equal to 3.
P(Xt+1 = 2|Xt = 1) = 0
He will not re-order, so the inventory cannot increase.
Markov chain
P =
  0.080 0.184 0.368 0.368
  0.632 0.368 0     0
  0.264 0.368 0.368 0
  0.080 0.184 0.368 0.368
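As a cross-check, every row of the repair-shop transition matrix can be derived from the demand PMF and the ordering policy; a sketch in Python (`transition_row` is an illustrative name, and lost sales are assumed, as in the slides):

```python
# Derive the repair-shop transition matrix from the demand PMF and the
# policy "order 3 screens when inventory hits 0, otherwise order nothing".
demand = {0: 0.368, 1: 0.368, 2: 0.184, 3: 0.060, 4: 0.020}

def transition_row(i):
    start = 3 if i == 0 else i          # stock after the Monday delivery
    row = [0.0] * 4
    for d, p in demand.items():
        row[max(start - d, 0)] += p     # unmet demand is lost, not backordered
    return [round(x, 3) for x in row]

P = [transition_row(i) for i in range(4)]
print(P[0])  # → [0.08, 0.184, 0.368, 0.368]
```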
Markov chain
Repair shop example:
Draw the transition diagram
[Transition diagram: states 0, 1, 2, 3 with arcs labeled by the
nonzero transition probabilities, e.g. 0 → 0 with 0.08, 1 → 0
with 0.632, 2 → 0 with 0.264, and self-loops such as 1 → 1
with 0.368.]
Markov chain
Weather example 2:
The weather in Gainesville changes quickly
However, the chances of being dry (no rain) tomorrow
depends on weather conditions of the last two days
In particular, the probability of being dry tomorrow is
0.8 if it was dry today and yesterday,
0.7 if it was dry today and wet yesterday,
0.6 if it was wet today and dry yesterday,
0.5 if it was wet today and yesterday.
Markov chain
Weather example 2:
If we let the state at day n depend only on whether or not
it is raining at day n, then is this model a MC? No.
Transform the process into a MC
Define the state as
State DD if it was dry both today and yesterday.
State WD if it was dry today and wet yesterday.
State DW if it was dry yesterday and wet today.
State WW if it was wet both today and yesterday.
We recapture the Markovian property.
Transition matrix (states ordered DD, WD, DW, WW):
P =
  0.8 0   0.2 0
  0.7 0   0.3 0
  0   0.6 0   0.4
  0   0.5 0   0.5
Markov chain
Last time
Stochastic process
State of the process, time parameter, state space
Markovian property
Markov Chain
Discrete/continuous time MC
Transform a process to a MC
Transition matrix/diagram
Markov chain
Gambler's ruin
A gambler is playing a series of games.
In each game, independent of everything else, he
either wins $1 with probability p or loses $1 with
probability 1 − p.
He quits playing either when he goes broke or when he
attains a fortune of $N.
Question: Can this process be represented by a MC? If
so, what are its transition probabilities and diagram?
Markov chain
Gambler's ruin
Define
Xt = n, if the gambler has n dollars on hand after playing t games.
Obtain the transition matrix P:
Pn,n+1 = p, Pn,n−1 = 1 − p, for n = 1, 2, . . . , N − 1,
P00 = PNN = 1.
Markov chain
Gamblers ruin
Transition matrix:
P =
  1    0    0    0   . . .  0    0    0
  1−p  0    p    0   . . .  0    0    0
  0    1−p  0    p   . . .  0    0    0
  .    .    .    .          .    .    .
  0    0    0    0   . . .  1−p  0    p
  0    0    0    0   . . .  0    0    1
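The banded structure above is straightforward to generate for any N and p; a sketch, with `gamblers_ruin_matrix` an illustrative name:

```python
# Build the (N+1)x(N+1) gambler's-ruin transition matrix for win
# probability p; states are the gambler's fortune 0, 1, ..., N.
def gamblers_ruin_matrix(N, p):
    P = [[0.0] * (N + 1) for _ in range(N + 1)]
    P[0][0] = P[N][N] = 1.0             # broke and fortune-N are absorbing
    for n in range(1, N):
        P[n][n + 1] = p                 # win $1
        P[n][n - 1] = 1 - p             # lose $1
    return P

P = gamblers_ruin_matrix(4, 0.5)
print(P[2])  # → [0.0, 0.5, 0.0, 0.5, 0.0]
```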
Markov chain
Gamblers ruin
Transition diagram:
[Transition diagram: states 0, 1, . . . , N on a line; self-loops
with probability 1 at the absorbing states 0 and N; from each
interior state n, an arc to n + 1 with probability p and an arc
to n − 1 with probability 1 − p.]
Markov chain
Take the Gainesville weather example (1: raining, 0: dry)
P(Xt+1 = 0|Xt = 0) = 0.8, P(Xt+1 = 0|Xt = 1) = 0.6,
P(Xt+1 = 1|Xt = 0) = 0.2, P(Xt+1 = 1|Xt = 1) = 0.4.
Given today is rainy:
What is the probability of being wet the day after
tomorrow? p = 0.4 × 0.4 + 0.6 × 0.2 = 0.28
What is the probability of being dry the day after
tomorrow? p = 0.4 × 0.6 + 0.6 × 0.8 = 0.72
What is the probability that it will be rainy in three
days? p = 0.72 × 0.2 + 0.28 × 0.4 = 0.256
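These by-hand conditioning sums can be reproduced directly by summing over tomorrow's state; a small sketch:

```python
# Chain the one-step weather probabilities (state 0 = dry, 1 = rainy).
P = [[0.8, 0.2],
     [0.6, 0.4]]

# Pr(rainy in 2 days | rainy today): condition on tomorrow's state k.
two_step_rain = sum(P[1][k] * P[k][1] for k in range(2))
# Pr(dry in 2 days | rainy today).
two_step_dry = sum(P[1][k] * P[k][0] for k in range(2))

print(round(two_step_rain, 3), round(two_step_dry, 3))  # → 0.28 0.72
```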
Markov chain
N-step transition probability:
Pij(n) = Pr(Xn = j | X0 = i)
Markov chain
N-step transition probabilities
Observation:
It is clear that Pij(1) = Pij for all i, j ∈ S.
When n = 2, we have:
Pij(2) = Σ_{k∈S} Pik Pkj
When n = 3, we have:
Pij(3) = Pi0(2) P0j + Pi1(2) P1j + · · · = Σ_{k∈S} Pik(2) Pkj
Markov chain
Chapman-Kolmogorov equation
For any m with 0 < m < n,
Pij(n) = Σ_{k∈S} Pik(m) Pkj(n−m).
Proof:
Pij(n) = Pr(Xn = j | X0 = i)
= Σ_{k∈S} Pr(Xn = j, Xm = k | X0 = i)
= Σ_{k∈S} Pr(Xn = j | Xm = k, X0 = i) Pr(Xm = k | X0 = i)
= Σ_{k∈S} Pr(Xn = j | Xm = k) Pr(Xm = k | X0 = i)   (Markovian property)
= Σ_{k∈S} Pkj(n−m) Pik(m).
Markov chain
Corollary:
If P represents the transition matrix of a MC with
stationary transition probabilities and P (n) represents the
n-step transition matrix of this MC, then
P (n) = P n , n = 1, 2, . . .
i.e., the n-step transition matrix can be computed using
matrix multiplication.
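The corollary in action: the n-step matrices of the weather chain come straight out of repeated matrix multiplication, a sketch using NumPy:

```python
import numpy as np

# Weather chain; the n-step transition matrix is the n-th matrix power.
P = np.array([[0.8, 0.2],
              [0.6, 0.4]])

P2 = np.linalg.matrix_power(P, 2)
P3 = np.linalg.matrix_power(P, 3)
print(np.round(P2, 3))  # rows [0.76 0.24] and [0.72 0.28]
print(np.round(P3, 3))  # rows [0.752 0.248] and [0.744 0.256]
```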
Markov chain
Gainesville weather example version 3:
Given the following conditions (1: raining, 0: dry):
P(Xt+1 = 0|Xt = 0) = 0.8, P(Xt+1 = 0|Xt = 1) = 0.6,
P(Xt+1 = 1|Xt = 0) = 0.2, P(Xt+1 = 1|Xt = 1) = 0.4.
Assume that there is a 50% chance that it will be rainy
today. What is the probability that it will be dry 2 days from
now?
Solution: first obtain P (2) :
P =
  0.800 0.200
  0.600 0.400
P (2) = P 2 =
  0.760 0.240
  0.720 0.280
Markov chain
Gainesville weather example version 3:
Solution: with P (2) computed as before, use conditioning and
total probability:
Pr(dry in 2 days) = Pr(dry today) Pr(dry in 2 days | dry today)
+ Pr(rain today) Pr(dry in 2 days | rain today)
= 0.5 × P00(2) + 0.5 × P10(2) = 0.5 × 0.760 + 0.5 × 0.720 = 0.74.
Markov chain
Find the PMF of Xn
In a MC with stationary transition probabilities,
Pr(Xn = j) = Σ_{i∈S} Pr(Xn = j | X0 = i) Pr(X0 = i)
= Σ_{i∈S} Pij(n) Pr(X0 = i).
Markov chain
Example:
Let {Xn , n 0} be a DTMC with state space
S = {1, 2, 3, 4} and the stationary transition probabilities,
P =
  . . .
  . . .
  0.5 0   0.5 0
  0.6 0.2 0.1 0.1
The initial PMF is a = (0.25, 0.25, 0.25, 0.25).
Pr(X3 = 4, X2 = 1, X1 = 3, X0 = 1)
= Pr(X0 = 1) p13 p31 p14
= (0.25)(0.3)(0.5)(0.4) = 0.015.
Markov chain
Example:
Find the PMF vector of X4 :
a(4) = aP 4 = (0.3411, 0.1354, 0.3253, 0.1983)
Pr(X3 = 4, X2 = 1, X1 = 3)
= Σ_{i=1}^{4} Pr(X3 = 4, X2 = 1, X1 = 3 | X0 = i) Pr(X0 = i)
= Σ_{i=1}^{4} ai pi3 p31 p14 = (0.20) Σ_{i=1}^{4} ai pi3 = 0.06.
Markov chain
Repair shop example cont.:
It is the beginning of week 0 and Jason has 3
screens in stock. He expects to sell at least two
screens this week, because the game weekend is
coming.
In fact, he estimates that there is a 75% chance that
his inventory will be empty at the end of week 0 and a
25% chance that his inventory will have only 1 screen
left.
What is the probability that he will see 0, 1, 2 or 3
screens in inventory at the end of week 2?
Markov chain
Repair shop example cont.:
Solution:
Given that Pr(X0 = 0) = 0.75 and Pr(X0 = 1) = 0.25.
We want ai = Pr(X2 = i) for i = 0, 1, 2, 3.
Then a = (0.75, 0.25, 0, 0) P 2 , where
P =
  0.080 0.184 0.368 0.368
  0.632 0.368 0     0
  0.264 0.368 0.368 0
  0.080 0.184 0.368 0.368
Thus, a = (0.2577, 0.2771, 0.2834, 0.1818).
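The week-2 calculation is a one-liner once the matrix is in NumPy; a sketch:

```python
import numpy as np

# Week-2 PMF: initial distribution times the 2-step transition matrix.
P = np.array([[0.080, 0.184, 0.368, 0.368],
              [0.632, 0.368, 0.000, 0.000],
              [0.264, 0.368, 0.368, 0.000],
              [0.080, 0.184, 0.368, 0.368]])
a0 = np.array([0.75, 0.25, 0.0, 0.0])

a2 = a0 @ np.linalg.matrix_power(P, 2)
print(np.round(a2, 4))  # ≈ [0.2577 0.2771 0.2834 0.1818]
```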
Markov chain
Last time:
N-step transition probability
Chapman-Kolmogorov equation
Pij(n) = Σ_{k∈S} Pik(m) Pkj(n−m)
Markov chain
Weather example
P =
  0.800 0.200
  0.600 0.400
Then
P (2) = P · P =
  0.760 0.240
  0.720 0.280
P (3) =
  0.752 0.248
  0.744 0.256
P (4) =
  0.750 0.250
  0.749 0.251
P (5) = P (4) · P =
  0.750 0.250
  0.750 0.250
Markov chain
Weather example: what is the probability that it will be
dry 5 days from now?
Since both rows of P (5) are (0.75, 0.25), the probability of
dry in 5 days is 0.75 whatever the initial distribution:
Assume that there is a 50% chance that it will be
rainy today: 0.5 × 0.75 + 0.5 × 0.75 = 0.75.
Assume that there is a 75% chance that it will be
rainy today: 0.75 × 0.75 + 0.25 × 0.75 = 0.75.
Assume that we know it will be rainy today:
1 × 0.75 + 0 × 0.75 = 0.75.
Markov chain
Weather example:
As n grows, the probability that the chain is in any
given state becomes independent of the state it
started in.
The elements inside any column of P (n) become
identical to each other.
Markov chain
Gambler's ruin
[Transition diagram repeated: absorbing states 0 and N with
self-loops of probability 1; each interior state n moves to
n + 1 with probability p and to n − 1 with probability 1 − p.]
Here the columns of P (n) do not become constant: where the
chain ends up depends on the state it started in.
Markov chain
What happens in the cycle example?
Consider the three-state cycle 1 → 2 → 3 → 1, each move
made with probability 1:
P =
  0 1 0
  0 0 1
  1 0 0
P (2) =
  0 0 1
  1 0 0
  0 1 0
P (3) =
  1 0 0
  0 1 0
  0 0 1
So P (n) cycles with period 3 and never converges.
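The periodic behavior is easy to confirm numerically; a sketch checking that the third power is the identity while the second is not:

```python
import numpy as np

# The cycle chain: P^n never settles down, it cycles with period 3.
P = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])

print(np.array_equal(np.linalg.matrix_power(P, 3), np.eye(3)))  # → True
print(np.array_equal(np.linalg.matrix_power(P, 2), np.eye(3)))  # → False
```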
Markov chain
Accessible
Given a MC, state j is accessible from state i if
Pij(n) > 0 for some n ≥ 0.
Communicate
If state i is accessible from state j and state j is accessible
from state i,
then states i and j are said to communicate, denoted
i ↔ j.
Irreducible:
If all states in a MC communicate, then the MC is
said to be irreducible.
Markov chain
Remarks:
Any state communicates with itself:
Pii(0) = Pr{X0 = i|X0 = i} = 1.
If state i communicates with state j, then state j
communicates with state i.
If state i communicates with state j and state j
communicates with state k, then state i
communicates with state k.
Markov chain
Transient:
In a MC, a state i is said to be transient if there exists
a state j that is accessible from state i but such that
state i is not accessible from state j.
Recurrent:
In a MC, a state i is said to be recurrent if and only if
it is not transient.
Absorbing:
In a MC, a state is said to be absorbing if, upon
entering this state, the process will never leave it.
A state i is absorbing if and only if Pii = 1.
Markov chain
Period of state:
The period of state i is d(i) = gcd{n ≥ 1 : Pii(n) > 0}.
If d(i) = 1, state i is said to be aperiodic.
Markov chain
Example: Suppose the MC is currently in state 3, and
P33 = 0.2.
P{the process is still in state 3 after n trials} = 0.2^n
The expected number of trials up to and including the
trial on which the process leaves state 3:
E[N] = Σ_n n (0.2)^{n−1} (0.8) = 1/0.8 = 5/4
Markov chain
Steady-state probabilities
Intuitively:
If there is a probability π1 to be in state 1 and π2 to be
in state 2,
then there will still be a probability π1 to be in state 1
and π2 to be in state 2 in the next step.
Recall the weather example:
P =
  0.8 0.2
  0.6 0.4
P (5) =
  0.75 0.25
  0.75 0.25
Markov chain
Steady-state probabilities
Recall the weather example:
P =
  0.8 0.2
  0.6 0.4
Solve π = πP together with the normalization condition:
0.8 π1 + 0.6 π2 = π1
0.2 π1 + 0.4 π2 = π2
π1 + π2 = 1
We obtain π1 = 3/4 and π2 = 1/4.
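Solving π = πP with the normalization condition is a small linear system; a sketch in NumPy that replaces one redundant balance equation by Σπ = 1:

```python
import numpy as np

# Steady state of the weather chain: solve pi (P - I) = 0 with sum(pi) = 1.
P = np.array([[0.8, 0.2],
              [0.6, 0.4]])

A = np.vstack([(P.T - np.eye(2))[:-1],  # keep one balance equation
               np.ones(2)])             # normalization: pi1 + pi2 = 1
b = np.array([0.0, 1.0])
pi = np.linalg.solve(A, b)
print(np.round(pi, 4))  # ≈ [0.75 0.25]
```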
Markov chain
Steady-state probabilities theorem
For any finite state ergodic Markov chain, lim_{n→∞} Pij(n)
exists and is independent of i. Furthermore,
lim_{n→∞} Pij(n) = πj > 0,
where the πj are the unique solution of
πj = Σ_{i=0}^{M} πi Pij , j = 0, . . . , M   (i.e., π = πP),
Σ_{j=0}^{M} πj = 1   (i.e., π e^T = 1).
Markov chain
Steady-state probabilities
The πj can be interpreted as:
Stationary probability: if the initial probability of being
in state j is πj , i.e., Pr(X0 = j) = πj for all j, then the
probability of finding the process in state j at time
n = 1, 2, . . . is also given by πj , i.e., Pr(Xn = j) = πj .
The probability to be in state j at time n when n is
sufficiently large.
The fraction of time the system spends in state j.
Markov chain
Recall the cycle example; the transition matrix is:
P =
  0 1 0
  0 0 1
  1 0 0
Solving (π1 , π2 , π3 ) = (π1 , π2 , π3 ) P with π1 + π2 + π3 = 1
gives π1 = π2 = π3 = 1/3.
This still represents the fraction of time spent in each
state.
Markov chain
Recall the cycle example. Although P (n) itself cycles, the
running averages of the n-step matrices converge:
(1/(3n−2)) Σ_{k=1}^{3n−2} P (k) ,
(1/(3n−1)) Σ_{k=1}^{3n−1} P (k) ,
(1/(3n)) Σ_{k=1}^{3n} P (k)
have entries of the form n/(3n−2), (n−1)/(3n−2),
n/(3n−1), (n−1)/(3n−1), and 1/3, and every one of these
averages converges, as n → ∞, to the matrix whose entries
are all 1/3.
Markov chain
Steady-state probabilities theorem (version II)
For any finite state irreducible Markov chain,
lim_{n→∞} (1/n) Σ_{k=1}^{n} Pij(k) exists and is independent
of i. Furthermore,
lim_{n→∞} (1/n) Σ_{k=1}^{n} Pij(k) = πj > 0,
where the πj are the unique solution of
πj = Σ_{i=0}^{M} πi Pij , j = 0, . . . , M   (i.e., π = πP),
Σ_{j=0}^{M} πj = 1   (i.e., π e^T = 1).
Markov chain
Recall the repair shop example; the transition matrix is:
P =
  0.080 0.184 0.368 0.368
  0.632 0.368 0     0
  0.264 0.368 0.368 0
  0.080 0.184 0.368 0.368
Solving (π0 , π1 , π2 , π3 ) = (π0 , π1 , π2 , π3 ) P, we have
π0 = 0.2856, π1 = 0.2848, π2 = 0.2631, π3 = 0.1663.
Markov chain
Recall the repair shop example, with weekly inventory cost
C(xt ) =
  0, if xt = 0,
  2, if xt = 1,
  8, if xt = 2,
  18, if xt = 3.
Compute the expected inventory cost of the system per
unit of time.
Markov chain
Recall the repair shop example
We have the steady-state probabilities
π0 = 0.2856, π1 = 0.2848, π2 = 0.2631, π3 = 0.1663.
Inventory cost:
C(xt ) =
  0, if xt = 0,
  2, if xt = 1,
  8, if xt = 2,
  18, if xt = 3.
The long run expected cost per unit of time will be:
C(0)π0 + C(1)π1 + C(2)π2 + C(3)π3 = 5.662.
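The weighted sum is a one-liner; a sketch using the rounded steady-state probabilities above (with four-decimal π's it comes out near 5.668, slightly off the slides' 5.662, which presumably uses unrounded values):

```python
# Long-run expected cost: weight each state's cost by its
# steady-state probability (rounded values from the slides).
pi = [0.2856, 0.2848, 0.2631, 0.1663]
cost = [0, 2, 8, 18]

expected_cost = sum(c * p for c, p in zip(cost, pi))
print(round(expected_cost, 3))  # → 5.668 with these rounded pi's
```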
Markov chain
Dinner example
Every time that the team wins a game, it wins its next
game with probability 0.8; every time it loses a game, it
wins its next game with probability 0.3.
If the team wins a game, then it has dinner together with
probability 0.7, whereas if the team loses then it has
dinner with probability 0.2. What proportions of games
result in a team dinner?
Markov chain
Dinner example
Let w be the proportion of games the team wins; then
w = 0.8w + 0.3(1 − w) ⇒ w = 3/5.
The proportion of games that result in a team dinner is
0.7 × (3/5) + 0.2 × (1 − 3/5) = 0.5.
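Both steps of the dinner calculation can be checked numerically; a tiny sketch:

```python
# Dinner example: w = 3/5 is the fixed point of the win-proportion
# equation, and total probability gives the dinner rate.
w = 3 / 5
fixed_point_gap = abs(0.8 * w + 0.3 * (1 - w) - w)  # should be ~0
dinner = 0.7 * w + 0.2 * (1 - w)
print(round(dinner, 3))  # → 0.5
```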
Markov chain
Taxi example
A taxi driver provides service in two zones of a city. Fares
picked up in zone A will have destinations in zone A with
probability 0.6 or in zone B with probability 0.4. Fares
picked up in zone B will have destinations in zone A with
probability 0.3 or in zone B with probability 0.7. The
driver's expected profit for a trip entirely in zone A is 6; for
a trip entirely in zone B it is 8; and for a trip that involves
both zones it is 12. Find the taxi driver's average profit per
trip.
Markov chain
Taxi example
Let the state be the pickup location:
Pr(A → A) = 0.6, Pr(B → A) = 0.3,
Pr(A → B) = 0.4, Pr(B → B) = 0.7.
The long run proportions of pickup locations can be
obtained from
πA = 0.6 πA + 0.3 πB = 0.6 πA + 0.3 (1 − πA).
We get πA = 3/7 and πB = 4/7.
Now let X be the profit of a trip:
E[X] = (3/7) E[X|A] + (4/7) E[X|B]
= (3/7)(0.6 × 6 + 0.4 × 12) + (4/7)(0.3 × 12 + 0.7 × 8)
= 62/7 ≈ 8.86.
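The taxi computation, worked numerically as a final check:

```python
# Taxi example: long-run pickup proportions, then condition the
# per-trip profit on the pickup zone.
pi_A = 3 / 7
pi_B = 4 / 7

profit_from_A = 0.6 * 6 + 0.4 * 12   # stay in A, or cross to B
profit_from_B = 0.3 * 12 + 0.7 * 8   # cross to A, or stay in B

avg_profit = pi_A * profit_from_A + pi_B * profit_from_B
print(round(avg_profit, 3))  # → 62/7 ≈ 8.857
```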