CHAPTER 3. HIDDEN MARKOV MODELS (HMMS)
[Figure 3.1 diagram: states Cold and Hot with outputs N and D. Start probabilities: Cold 0.45, Hot 0.55; transitions: Cold → Cold 0.7, Cold → Hot 0.3, Hot → Cold 0.25, Hot → Hot 0.75; emissions: Cold emits N with probability 0.8 and D with 0.2, Hot emits N with 0.3 and D with 0.7.]

Figure 3.1: Example of an HMM modeling the drinking behavior of a professor at the University of Pennsylvania.
Also, from each of the states Cold and Hot, we have emission probabilities of producing the output N or D, and these probabilities also sum to 1.
The vector

π = (0.45, 0.55)

is the vector of initial (start) probabilities of the states Cold and Hot.
[Figure 3.2 diagram: states Cold and Hot with outputs S, M, L (small, medium, large growth rings). Start probabilities: Cold 0.4, Hot 0.6; transitions: Cold → Cold 0.6, Cold → Hot 0.4, Hot → Cold 0.3, Hot → Hot 0.7; emissions: Cold emits S with probability 0.7, M with 0.2, L with 0.1, and Hot emits S with 0.1, M with 0.4, L with 0.5.]

Figure 3.2: Example of an HMM modeling the temperature in terms of tree growth rings.
Pr(S, O) = π(i1)B(i1, 1) ∏_{t=2}^{T} A(i_{t-1}, i_t)B(i_t, t).
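As a numerical check, this product can be evaluated directly. The sketch below assumes the parameters read off Figure 3.1 (π = (0.45, 0.55), transitions 0.7/0.3 from Cold and 0.25/0.75 from Hot, emissions Cold: N 0.8, D 0.2 and Hot: N 0.3, D 0.7) and an illustrative encoding of states and outputs as indices; it computes Pr(S, O) for the state sequence (Cold, Cold, Cold, Hot) and observations (N, N, N, D).

```python
# Joint probability Pr(S, O) = pi(i1)B(i1, o1) * prod_{t>=2} A(i_{t-1}, i_t)B(i_t, o_t).
# Encoding (illustrative): states 0 = Cold, 1 = Hot; outputs 0 = N, 1 = D.
pi = [0.45, 0.55]
A = [[0.7, 0.3],
     [0.25, 0.75]]
B = [[0.8, 0.2],   # Cold: Pr(N), Pr(D)
     [0.3, 0.7]]   # Hot:  Pr(N), Pr(D)

def joint_prob(states, obs):
    p = pi[states[0]] * B[states[0]][obs[0]]
    for t in range(1, len(obs)):
        p *= A[states[t - 1]][states[t]] * B[states[t]][obs[t]]
    return p

print(joint_prob([0, 0, 0, 1], [0, 0, 0, 1]))  # Cold,Cold,Cold,Hot on N,N,N,D; ≈ 0.0237
```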
Initially, we set

score(j, 1) = π(j)B(j, 1), j = 1, 2,

and since ω1 = 1 (the first observation is N) we get score(1, 1) = 0.45 × 0.8 = 0.36 and score(2, 1) = 0.55 × 0.3 = 0.165.
Then, since ω2 = 1, we get

score(1, 2) = max{tscore(1), tscore(2)} = max{0.2016, 0.0330} = 0.2016,

and

score(2, 2) = max{0.0324, 0.0371} = 0.0371,

and pred(1, 2) = 1 and pred(2, 2) = 2. Since ω3 = 1, we get

score(1, 3) = max{0.1129, 0.0074} = 0.1129,

and

score(2, 3) = max{0.0181, 0.0084} = 0.0181,

and pred(1, 3) = 1 and pred(2, 3) = 1. Since ω4 = 2, we get

score(1, 4) = max{0.0158, 0.0009} = 0.0158,

and

score(2, 4) = max{0.0237, 0.0095} = 0.0237,

and pred(1, 4) = 1 and pred(2, 4) = 1.
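The hand computation above can be checked mechanically. Here is a minimal Viterbi sketch, assuming the Figure 3.1 parameters and the observation sequence (N, N, N, D), with states and outputs encoded as indices (an illustrative choice); score and pred mirror the quantities in the text.

```python
# Minimal Viterbi sketch; states 0 = Cold, 1 = Hot; outputs 0 = N, 1 = D.
pi = [0.45, 0.55]
A = [[0.7, 0.3], [0.25, 0.75]]
B = [[0.8, 0.2], [0.3, 0.7]]

def viterbi(obs):
    n, T = len(pi), len(obs)
    score = [[0.0] * T for _ in range(n)]
    pred = [[0] * T for _ in range(n)]
    for j in range(n):
        score[j][0] = pi[j] * B[j][obs[0]]
    for t in range(1, T):
        for j in range(n):
            tscore = [score[k][t - 1] * A[k][j] * B[j][obs[t]] for k in range(n)]
            pred[j][t] = max(range(n), key=lambda k: tscore[k])
            score[j][t] = tscore[pred[j][t]]
    # backtrack from the best final state
    path = [max(range(n), key=lambda j: score[j][T - 1])]
    for t in range(T - 1, 0, -1):
        path.append(pred[path[-1]][t])
    return score, list(reversed(path))

score, path = viterbi([0, 0, 0, 1])   # observations N, N, N, D
print(round(score[1][3], 4))          # 0.0237, as in the text
print(path)                           # [0, 0, 0, 1], i.e. Cold, Cold, Cold, Hot
```

Backtracking from the larger final score, 0.0237 in state Hot, recovers the most likely state sequence Cold, Cold, Cold, Hot.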
for i = 1, . . . , n.
B = (B(i, j)) is an n × m matrix called the state observation probability matrix (also called the confusion matrix), with

B(i, j) ≥ 0, 1 ≤ i ≤ n, 1 ≤ j ≤ m, and ∑_{j=1}^{m} B(i, j) = 1,

for i = 1, . . . , n.
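Since every row of A and of B (and the vector π) must sum to 1, a quick sanity check catches mistyped parameters. A small sketch, using the Figure 3.1 values as example data:

```python
# Check that an HMM specification is well formed: nonnegative entries
# and each row summing to 1 (row-stochastic matrices).
def is_stochastic(rows, tol=1e-9):
    return all(all(x >= 0 for x in row) and abs(sum(row) - 1.0) <= tol
               for row in rows)

pi = [0.45, 0.55]                    # example values from Figure 3.1
A = [[0.7, 0.3], [0.25, 0.75]]
B = [[0.8, 0.2], [0.3, 0.7]]
print(is_stochastic([pi]) and is_stochastic(A) and is_stochastic(B))  # True
```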
3.1. HIDDEN MARKOV MODELS (HMMS)
The problem is to find a state sequence S for which the joint probability Pr(S, O) is maximal.
begin
  for j = 1 to n do
    score(j, 1) = π(j)B(j, 1)
  endfor;
  for t = 2 to T do
    for j = 1 to n do
      for k = 1 to n do
        tscore(k) = score(k, t - 1)A(k, j)B(j, t)
      endfor;
      score(j, t) = ∑_k tscore(k)
    endfor
  endfor;
  tprob = ∑_j score(j, T)
end
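This pseudocode transcribes directly into Python. The sketch below assumes the Figure 3.1 parameters and the observation sequence (N, N, N, D), with the same illustrative index encoding as before; tprob is then Pr(O), the total probability of the observations summed over all state sequences.

```python
# Forward algorithm: like Viterbi, but score(j, t) sums over the
# predecessor states k instead of maximizing over them.
pi = [0.45, 0.55]
A = [[0.7, 0.3], [0.25, 0.75]]
B = [[0.8, 0.2], [0.3, 0.7]]          # rows: Cold, Hot; columns: N, D

def forward(obs):
    n, T = len(pi), len(obs)
    score = [[0.0] * T for _ in range(n)]
    for j in range(n):
        score[j][0] = pi[j] * B[j][obs[0]]
    for t in range(1, T):
        for j in range(n):
            score[j][t] = sum(score[k][t - 1] * A[k][j] for k in range(n)) * B[j][obs[t]]
    return sum(score[j][T - 1] for j in range(n))   # tprob = Pr(O)

print(round(forward([0, 0, 0, 1]), 4))  # Pr(N, N, N, D) ≈ 0.072
```

Note that Pr(O) ≈ 0.072 exceeds the best single-path probability 0.0237 found by Viterbi, since the forward algorithm accumulates the contributions of every state sequence.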
[Figure 3.3 diagram: states Cold and Hot with outputs N and D. Start probabilities: Cold 0.13, Hot 0.87; transitions: Cold → Cold 0.33, Cold → Hot 0.67, Hot → Cold 0.1, Hot → Hot 0.9; emissions: Cold emits N with probability 0.95 and D with 0.05, Hot emits N with 0.2 and D with 0.8.]

Figure 3.3: Example of an HMM modeling the drinking behavior of a professor at Harvard.
3.2. THE VITERBI ALGORITHM AND THE FORWARD ALGORITHM