Hidden Markov Models, Series 3

  • Introduction
  • Generating Patterns
  • Hidden Patterns
  • Hidden Markov Models
  • Forward Algorithm
  • Viterbi Algorithm
  • Forward-Backward Algorithm
  • Summary

Hidden Markov Models

Definition

A hidden Markov model can be defined by a triple (π, A, B):

  1. π is the vector of initial state probabilities.
  2. A = (aij) is the (hidden) state transition matrix: aij = P(xi at time t | xj at time t−1), i.e. the probability that the hidden state is i at time t given that it was j at time t−1.
  3. B = (bij) is the confusion matrix: bij = P(yi | xj), i.e. the probability of observing yi when the hidden state is xj at a given point in time.

It is worth noting that each probability in the state transition matrix is time-independent; that is, we assume these probabilities do not change over time. This is, of course, the most unrealistic assumption a Markov model makes about the real world.
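
To make the triple concrete, here is a minimal sketch in Python for the seaweed/weather example used in this series; the state names and every probability value below are illustrative assumptions, not values from the original tutorial.

    import numpy as np

    # Hidden states (weather) and observable states (seaweed dampness);
    # both the labels and the numbers are made up for illustration.
    states = ["sunny", "cloudy", "rainy"]
    observations = ["dry", "dryish", "damp", "soggy"]

    # pi: initial state probability vector
    pi = np.array([0.6, 0.3, 0.1])

    # A[j, i] = P(hidden state i at time t | hidden state j at time t-1);
    # each row sums to 1
    A = np.array([[0.5, 0.3, 0.2],
                  [0.3, 0.4, 0.3],
                  [0.2, 0.3, 0.5]])

    # B[j, k] = P(observing symbol k | hidden state j); each row sums to 1
    B = np.array([[0.60, 0.20, 0.15, 0.05],
                  [0.25, 0.25, 0.25, 0.25],
                  [0.05, 0.10, 0.35, 0.50]])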

 

Uses of Hidden Markov Models

Once a system can be described as a hidden Markov model, three problems can be solved. The first two are pattern recognition problems: 1) finding the probability of an observed sequence given a hidden Markov model (evaluation); and 2) finding the sequence of hidden states that most probably generated an observed sequence (decoding). The third problem is generating a hidden Markov model from a set of observed sequences (learning).

1. Evaluation

Suppose we have a number of hidden Markov models (that is, a set of triples) describing different systems, along with a sequence of observations. We may want to know which HMM most probably generated the sequence. For example, we may have a "Summer" model and a "Winter" model for the seaweed, since the seaweed behaves differently in summer and winter; we could then judge whether it is summer or winter from a sequence of observed states (how damp the seaweed is).

We can use the forward algorithm to calculate the probability of an observed sequence under a particular HMM, and then find the most likely model accordingly.

This kind of application commonly appears in speech recognition, where a large number of HMMs are used, each modelling a particular word. An observation sequence is formed from a spoken word, and the word is then recognised by finding the HMM under which this observation sequence has the highest probability.
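
As an illustration, here is a compact, unscaled sketch of the forward algorithm, reusing the pi, A, B arrays assumed above (a practical implementation would rescale the partial probabilities to avoid underflow on long sequences):

    def forward(obs, pi, A, B):
        """Total probability of the observation sequence obs under (pi, A, B)."""
        alpha = pi * B[:, obs[0]]           # partial probabilities at t = 0
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]   # propagate one step, weight by emission
        return alpha.sum()

    # Evaluation: score the sequence dry, damp, soggy (indices into observations)
    # under competing models and keep the model with the higher probability.
    obs = [0, 2, 3]
    print(forward(obs, pi, A, B))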

 

2. Decoding

Finding the most likely sequence of hidden states for a given sequence of observations.

Similar to the problem above, and often more interesting, is finding the hidden sequence behind an observed sequence. In many cases we are more interested in the hidden states, because they contain valuable information that cannot be observed directly. For example, in the seaweed and weather case, a retired person can only see the state of the seaweed, but he wants to know the state of the weather. Here we can use the Viterbi algorithm to obtain the optimal hidden state sequence for an observed sequence, provided we already have an HMM.

Another field in which the Viterbi algorithm is widely used is part-of-speech tagging in natural language. The words of a sentence are the observable states, and their parts of speech are the hidden states. By finding the most probable hidden state sequence for the word sequence of a sentence, given its context, we obtain the most likely part of speech for each word, and this information can then be used in other tasks.
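
Here is a minimal sketch of the Viterbi algorithm under the same assumed model, again without the scaling a production implementation would need:

    def viterbi(obs, pi, A, B):
        """Most likely hidden state sequence for the observation sequence obs."""
        delta = pi * B[:, obs[0]]            # best path probability ending in each state
        backpointers = []
        for o in obs[1:]:
            trans = delta[:, None] * A       # trans[j, i] = delta[j] * A[j, i]
            backpointers.append(trans.argmax(axis=0))  # best predecessor of each state
            delta = trans.max(axis=0) * B[:, o]
        path = [int(delta.argmax())]         # best final state
        for back in reversed(backpointers):  # walk the back-pointers
            path.append(int(back[path[-1]]))
        return path[::-1]

    # Decoding: the most likely weather sequence behind dry, damp, soggy.
    print([states[i] for i in viterbi([0, 2, 3], pi, A, B)])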

 

3. Learning

Generating a hidden Markov model from a set of observation sequences.

The third problem is also the most difficult: given a set of observed sequences, find the most likely HMM, that is, determine the most likely triple (π, A, B). When the matrices A and B are not directly (empirically) measurable, the forward-backward algorithm can be used to solve this problem.
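
For illustration, here is a sketch of a single forward-backward (Baum-Welch) re-estimation step under the same assumed model; a real implementation would iterate this to convergence, handle multiple sequences, and rescale to avoid underflow:

    def baum_welch_step(obs, pi, A, B):
        """One re-estimation of (pi, A, B) from a single observation sequence."""
        obs = np.asarray(obs)
        T, n = len(obs), len(pi)
        alpha = np.zeros((T, n))             # forward variables
        beta = np.zeros((T, n))              # backward variables
        alpha[0] = pi * B[:, obs[0]]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        beta[-1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
        likelihood = alpha[-1].sum()
        gamma = alpha * beta / likelihood    # gamma[t, j] = P(state j at t | obs)
        # xi[t, j, i] = P(state j at t and state i at t+1 | obs)
        xi = (alpha[:-1, :, None] * A[None, :, :]
              * (B[:, obs[1:]].T * beta[1:])[:, None, :]) / likelihood
        new_pi = gamma[0]
        new_A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
        new_B = np.zeros_like(B)
        for k in range(B.shape[1]):          # expected emissions of each symbol
            new_B[:, k] = gamma[obs == k].sum(axis=0)
        new_B /= gamma.sum(axis=0)[:, None]
        return new_pi, new_A, new_B

    # One learning step from a short observation sequence.
    new_pi, new_A, new_B = baum_welch_step([0, 2, 3, 2], pi, A, B)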

 

Summary

Although it makes some unrealistic assumptions, the HMM described by a triple is of great value in describing and analysing real systems, and it allows the following problems to be solved:

  1. Using the forward algorithm to find the hidden Markov model most likely to have generated an observed sequence (evaluation)
  2. Using the Viterbi algorithm to find the most likely hidden sequence for an observed sequence (decoding)
  3. Using the forward-backward algorithm to determine the hidden Markov model parameters most likely to have generated a set of observed sequences (learning)
