Markov Chains

Tutorial #5
Ydo Wexler & Dan Geiger
Statistical Parameter Estimation
Reminder

The basic paradigm:
  MLE / Bayesian approach
  Input data: series of observations X_1, X_2, ..., X_t
  We assumed the observations were i.i.d. (independent and identically distributed)

  [Figure: data set -> model; parameters: Heads - P(H), Tails - 1 - P(H)]
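As a reminder of how the coin parameter is fit, here is a minimal sketch (not from the slides; the function name and toy data are hypothetical): under the i.i.d. assumption, the maximum-likelihood estimate of P(H) is simply the fraction of heads in the observed sequence.

```python
def mle_p_heads(observations):
    """MLE of P(H) for i.i.d. coin tosses: the fraction of 'H' in the data."""
    obs = list(observations)
    return sum(1 for x in obs if x == 'H') / len(obs)

print(mle_p_heads("HTHHTH"))  # 4 heads out of 6 tosses -> 0.666...
```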
3
Markov Process
Markov Property: The state of the system at time t+1 depends only
on the state of the system at time t
  X_1 -> X_2 -> X_3 -> X_4 -> X_5

  Pr[X_{t+1} = x_{t+1} | X_t = x_t, ..., X_1 = x_1] = Pr[X_{t+1} = x_{t+1} | X_t = x_t]

Stationary Assumption: Transition probabilities are independent of
time (t)
  Pr[X_{t+1} = b | X_t = a] = p_ab

Bounded memory transition model
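A minimal sketch (not from the slides) of what the stationary, bounded-memory assumption buys us in code: one transition matrix is reused at every time step, and the next state is sampled from the row of the current state only. The two-state chain and its probabilities below are hypothetical.

```python
import numpy as np

# Hypothetical two-state chain; P[a, b] = Pr(X_{t+1} = b | X_t = a),
# the same matrix at every time step (stationary assumption).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

def step(a, rng):
    """Sample X_{t+1} given only X_t = a (the Markov property)."""
    return rng.choice(len(P), p=P[a])

rng = np.random.default_rng(0)
x = 0
for t in range(5):
    x = step(x, rng)
    print(f"X_{t+1} = {x}")
```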
4
Markov Process
Simple Example

Weather:
  raining today     -> 40% rain tomorrow
                       60% no rain tomorrow
  not raining today -> 20% rain tomorrow
                       80% no rain tomorrow

Stochastic FSM:
  [Diagram: states rain and no rain; rain -> rain 0.4, rain -> no rain 0.6,
   no rain -> rain 0.2, no rain -> no rain 0.8]
5
Markov Process
Simple Example

Weather:
  raining today     -> 40% rain tomorrow
                       60% no rain tomorrow
  not raining today -> 20% rain tomorrow
                       80% no rain tomorrow

The transition matrix:

  P = | 0.4  0.6 |
      | 0.2  0.8 |

Stochastic matrix:
  Rows sum up to 1
Doubly stochastic matrix:
  Rows and columns sum up to 1
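A small numpy sketch (the array below just transcribes the matrix above) of the stochastic-matrix check: every row sums to 1, while this particular P is not doubly stochastic because its columns do not.

```python
import numpy as np

# Weather transition matrix: rows = today (rain, no rain),
# columns = tomorrow (rain, no rain).
P = np.array([[0.4, 0.6],
              [0.2, 0.8]])

print(P.sum(axis=1))  # rows sum to 1:    [1. 1.]
print(P.sum(axis=0))  # columns do not:   [0.6 1.4]
```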
6
Markov Process
Gambler's Example

Gambler starts with $10
At each play we have one of the following:
  Gambler wins $1 with probability p
  Gambler loses $1 with probability 1-p
Game ends when the gambler goes broke, or gains a fortune of $100
(Both 0 and 100 are absorbing states)

  [Diagram: states 0, 1, 2, ..., 99, 100; each state moves right with
   probability p and left with probability 1-p; start at state 10 ($10);
   0 and 100 are absorbing]
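A simulation sketch of this walk (the helper name, the choice p = 0.5, and the number of trials are illustrative, not from the slides): each game runs until one of the absorbing states is reached.

```python
import numpy as np

def play_until_absorbed(p=0.5, start=10, goal=100, rng=None):
    """Run one game: +$1 with probability p, -$1 otherwise, until 0 or goal."""
    rng = rng or np.random.default_rng()
    x = start
    while 0 < x < goal:
        x += 1 if rng.random() < p else -1
    return x  # absorbing state reached: 0 (broke) or goal (fortune)

rng = np.random.default_rng(0)
wins = sum(play_until_absorbed(rng=rng) == 100 for _ in range(1000))
print(wins, "of 1000 games reached $100")  # around 100 when p = 0.5 (start/goal = 10%)
```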
7
Markov Process

Markov process - described by a stochastic FSM
Markov chain - a random walk on this graph
(distribution over paths)

Edge-weights give us

  Pr[X_{t+1} = b | X_t = a] = p_ab

We can ask more complex questions, like

  Pr[X_{t+2} = b | X_t = a] = ?

  [Diagram: the gambler's chain again - states 0, 1, 2, ..., 99, 100,
   moving right with probability p and left with probability 1-p,
   starting at $10]
8
Markov Process
Coke vs. Pepsi Example

Given that a person's last cola purchase was Coke,
there is a 90% chance that his next cola purchase will
also be Coke.
If a person's last cola purchase was Pepsi, there is
an 80% chance that his next cola purchase will also be
Pepsi.

  [Diagram: states coke and pepsi; coke -> coke 0.9, coke -> pepsi 0.1,
   pepsi -> pepsi 0.8, pepsi -> coke 0.2]

The transition matrix:

  P = | 0.9  0.1 |
      | 0.2  0.8 |
9
Markov Process
Coke vs. Pepsi Example (cont)

Given that a person is currently a Pepsi purchaser,
what is the probability that he will purchase Coke two
purchases from now?

Pr[Pepsi -> ? -> Coke] =
  Pr[Pepsi -> Coke -> Coke] + Pr[Pepsi -> Pepsi -> Coke] =
  0.2 * 0.9 + 0.8 * 0.2 = 0.34

  P = | 0.9  0.1 |      P^2 = | 0.83  0.17 |
      | 0.2  0.8 |            | 0.34  0.66 |

  (0.34 is the Pepsi -> Coke entry of P^2: row Pepsi, column Coke.)
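The same answer via matrix multiplication, as a numpy sketch (the state order (Coke, Pepsi) is an assumption matching the matrix above):

```python
import numpy as np

P = np.array([[0.9, 0.1],   # Coke  -> Coke, Pepsi
              [0.2, 0.8]])  # Pepsi -> Coke, Pepsi

P2 = P @ P
print(P2)        # [[0.83 0.17]
                 #  [0.34 0.66]]
print(P2[1, 0])  # row Pepsi, column Coke: 0.34
```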
10
Markov Process
Coke vs. Pepsi Example (cont)

Given that a person is currently a Coke purchaser,
what is the probability that he will purchase Pepsi
three purchases from now?

  P^3 = P^2 * P = | 0.83  0.17 | | 0.9  0.1 | = | 0.781  0.219 |
                  | 0.34  0.66 | | 0.2  0.8 |   | 0.438  0.562 |

  The answer is the Coke -> Pepsi entry of P^3: 0.219
11
Markov Process
Coke vs. Pepsi Example (cont)

Assume each person makes one cola purchase per week
Suppose 60% of all people now drink Coke, and 40% drink Pepsi
What fraction of people will be drinking Coke three weeks from now?

  P = | 0.9  0.1 |      P^3 = | 0.781  0.219 |
      | 0.2  0.8 |            | 0.438  0.562 |

Pr[X_3 = Coke] = 0.6 * 0.781 + 0.4 * 0.438 = 0.6438

Q_i - the distribution in week i
Q_0 = (0.6, 0.4) - initial distribution
Q_3 = Q_0 * P^3 = (0.6438, 0.3562)
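The week-by-week update as a numpy sketch (Q_i kept as a row vector, following the slide's notation):

```python
import numpy as np

P  = np.array([[0.9, 0.1],
               [0.2, 0.8]])
Q0 = np.array([0.6, 0.4])  # initial distribution: 60% Coke, 40% Pepsi

Q3 = Q0 @ np.linalg.matrix_power(P, 3)  # Q_3 = Q_0 * P^3
print(Q3)                               # [0.6438 0.3562]
```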
12
Markov Process
Coke vs. Pepsi Example (cont)

Simulation:

  [Plot: Pr[X_i = Coke] as a function of week i, converging to 2/3]

  (2/3, 1/3) * | 0.9  0.1 | = (2/3, 1/3)
               | 0.2  0.8 |

  stationary distribution

  [Diagram: states coke and pepsi; coke -> coke 0.9, coke -> pepsi 0.1,
   pepsi -> pepsi 0.8, pepsi -> coke 0.2]
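A sketch of finding that stationary distribution numerically: repeatedly applying P to any starting distribution converges, for this chain, to the fixed point (2/3, 1/3) (simple power iteration; the iteration count is an arbitrary choice).

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

pi = np.array([0.6, 0.4])  # any initial distribution works here
for _ in range(100):
    pi = pi @ P            # converges to the fixed point pi * P = pi
print(pi)                  # approximately [0.6667 0.3333] = (2/3, 1/3)
```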
13
Hidden Markov Models - HMM

  [Diagram: hidden states H_1 -> H_2 -> ... -> H_{L-1} -> H_L form a Markov
   chain; each hidden state H_i emits an observed data point X_i]
14
Hidden Markov Models - HMM
Coin-Tossing Example

Hidden states H_i: Fair / Loaded
Observed data X_i: Head / Tail

  [Diagram: two hidden states, fair and loaded.
   Transition probabilities: fair -> fair 0.9, fair -> loaded 0.1,
   loaded -> loaded 0.9, loaded -> fair 0.1.
   Emission probabilities: fair emits H with 1/2 and T with 1/2;
   loaded emits H with 3/4 and T with 1/4.]
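A generative sketch of this HMM (parameters read off the diagram above; the state order (fair, loaded) and symbol order (H, T) are my own conventions): the hidden state evolves by the transition matrix, and each state emits a Head/Tail symbol from its own distribution.

```python
import numpy as np

A = np.array([[0.9, 0.1],    # transitions: fair -> (fair, loaded), loaded -> (fair, loaded)
              [0.1, 0.9]])
B = np.array([[0.5, 0.5],    # emissions: fair   -> (H, T)
              [0.75, 0.25]]) #            loaded -> (H, T)
states, symbols = ["fair", "loaded"], ["H", "T"]

def sample(L=10, start=0, seed=0):
    """Sample L (hidden state, observed symbol) pairs from the HMM."""
    rng = np.random.default_rng(seed)
    h, out = start, []
    for _ in range(L):
        out.append((states[h], symbols[rng.choice(2, p=B[h])]))
        h = rng.choice(2, p=A[h])
    return out

print(sample())
```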
15
Hidden Markov Models - HMM
C-G Islands Example

C-G islands: genome regions which are very rich in C and G

  [Diagram: two four-state transition models over {A, C, G, T}, one for
   regular DNA and one for C-G islands, with a "change" transition between
   the two regimes; edge labels are built from probabilities p and q,
   e.g. p/3, p/6, (1-p)/4, q/4, (1-q)/3, (1-q)/6]
16
Hidden Markov Models - HMM
C-G Islands Example

Hidden states H_i: C-G island / Regular
Observed data X_i: {A, C, G, T}

  [Diagram: HMM chain H_1 -> H_2 -> ... -> H_L with emissions X_1, ..., X_L]

To be continued...