In the previous blog post we covered the first HMM problem, the evaluation problem. In this post we solve the second one, the decoding problem: given an observation sequence O = o1 o2 ... oT and model parameters λ = (A, B, π), find the hidden state sequence S that best explains the observations. The most common algorithm for this is the Viterbi algorithm.
Similarly, we first introduce three quantities:

δ_t(j): the highest probability, over all state paths q1 q2 ... q(t-1) that end in state j at time t, of generating the observations o1 o2 ... ot.

ψ_t(j): a state value, namely the predecessor state that achieved the maximum above; it is what we backtrack through at the end.

i*_t: the optimal state at time t, recovered by backtracking, so it is also a state value.

From these explanations, the three quantities can be written mathematically as:

δ_1(i) = π_i b_i(o_1)
δ_t(j) = max_{1≤i≤N} [δ_{t-1}(i) a_{ij}] · b_j(o_t),  t = 2, ..., T
ψ_t(j) = argmax_{1≤i≤N} [δ_{t-1}(i) a_{ij}]
i*_T = argmax_{1≤i≤N} δ_T(i),  and i*_t = ψ_{t+1}(i*_{t+1}) for t = T-1, ..., 1
With the observation sequence known, solving for the optimal state sequence with the Viterbi algorithm looks very much like the forward algorithm we used in the previous post to compute the probability of the observations. The only difference is in the recursion: instead of summing over the predecessor states, we take the maximum over them (and remember which predecessor achieved it).
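That sum-versus-max difference is easiest to see in code. Below is a minimal sketch of one recursion step for both algorithms, using a made-up two-state model (these numbers are illustrative only, not the model from this post's figure):

```python
import numpy as np

# Hypothetical two-state model (NOT the one from the post's figure).
A = np.array([[0.7, 0.3],      # A[i, j] = P(next state j | current state i)
              [0.4, 0.6]])
b_t = np.array([0.1, 0.8])     # emission prob of the current symbol in each state
prev = np.array([0.9, 0.05])   # alpha_{t-1} (forward) or delta_{t-1} (Viterbi)

# Forward algorithm: SUM over predecessor states.
alpha_t = (prev @ A) * b_t

# Viterbi: MAX over predecessor states, remembering which predecessor won.
scores = prev[:, None] * A          # scores[i, j] = prev[i] * A[i, j]
delta_t = scores.max(axis=0) * b_t
psi_t = scores.argmax(axis=0)       # best predecessor of each state j

print(alpha_t)   # forward probabilities for this step
print(delta_t)   # Viterbi probabilities for this step
print(psi_t)     # backpointers for this step
```

Everything else, including the emission factor b_j(o_t), is identical; the extra bookkeeping array psi_t is what makes backtracking possible later.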
With that in place, let's return to the question posed at the beginning:
The HMM model is as follows. Previously we used the forward algorithm to compute the probability of the observed symbol sequence O = {ABAB}; now we use the Viterbi algorithm to find its most likely state sequence.
As before, the initial probability vector is π = (1, 0, 0), i.e. the chain always starts in state S1. Following the formulas above, we solve for δ and ψ step by step. The solution is as follows:
At the first observation (t = 1):

At the second observation (t = 2):

At the third observation (t = 3):

At the fourth observation (t = 4):
The result of the recursion is:
Therefore, the final state sequence is S1, S2, S2, S2.
The calculation result is as follows:
The light green arrows indicate the most likely sequence of states. My notes are a bit messy, so you may want to work through the calculation yourself.
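The whole procedure, recursion plus backtracking, can be sketched in a few lines. The transition and emission matrices below are hypothetical stand-ins (the post's actual numbers are in the figure), but π = (1, 0, 0) matches the example above; with these made-up matrices the decoded path for ABAB happens to come out as S1, S2, S2, S2 as well:

```python
import numpy as np

def viterbi(pi, A, B, obs):
    """Return the most likely state path for obs and its probability."""
    T, N = len(obs), len(pi)
    delta = np.zeros((T, N))            # delta[t, j]: best path prob ending in j at t
    psi = np.zeros((T, N), dtype=int)   # psi[t, j]: best predecessor of state j at t
    delta[0] = pi * B[:, obs[0]]        # initialization: pi_i * b_i(o_1)
    for t in range(1, T):
        scores = delta[t - 1][:, None] * A      # scores[i, j] = delta_{t-1}(i) * a_ij
        psi[t] = scores.argmax(axis=0)          # remember the winning predecessor
        delta[t] = scores.max(axis=0) * B[:, obs[t]]
    # Backtrack from the best final state.
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    path.reverse()
    return path, float(delta[-1].max())

# Hypothetical 3-state model; pi forces the chain to start in S1 as in the post,
# but A and B are made up, NOT the matrices from the post's figure.
pi = np.array([1.0, 0.0, 0.0])
A = np.array([[0.2, 0.6, 0.2],
              [0.1, 0.7, 0.2],
              [0.3, 0.3, 0.4]])
B = np.array([[0.8, 0.2],       # rows: states S1..S3; columns: symbols A, B
              [0.3, 0.7],
              [0.5, 0.5]])
obs = [0, 1, 0, 1]              # the sequence ABAB as symbol indices

path, prob = viterbi(pi, A, B, obs)
print([f"S{i + 1}" for i in path], prob)   # path as state names, plus its probability
```

Because the result depends entirely on the assumed parameters, treat this as a sketch for checking your own hand calculation rather than a reproduction of the figure.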