Theory notes: Hidden Markov Models (HMM)



Reference Link: http://www.zhihu.com/question/20962240

Reference Link: http://blog.csdn.net/ppn029012/article/details/8923501

Link to this blog: http://www.cnblogs.com/dzyBK/p/5011727.html

1 Problem Setup

Suppose there are n dice (numbered 1 to n), each with m faces; each face is labeled with a distinct number in [1, m]. (1) The dice are drawn with replacement, and which die is drawn this time depends on which die was drawn last time. (2) For a single die, different numbers do not necessarily appear with the same probability (the faces are not uniformly distributed), and different dice do not necessarily show the same number with the same probability (the n dice are genuinely different).

Condition (1) can be expressed as a matrix

    P = (p_ij), 1 ≤ i, j ≤ n,

where p_ij is the probability that die j is drawn this time, given that die i was drawn last time.

Condition (2) can be expressed as a matrix

    Q = (q_ij), 1 ≤ i ≤ n, 1 ≤ j ≤ m,

where q_ij is the probability that die i shows the number j.

In HMM terminology, the die (or its number) is called the hidden state, and the number shown on a face is called the observation, i.e., the output of the hidden state. The n×n matrix P is called the state transition probability matrix, and the n×m matrix Q is called the emission (observation) probability matrix. The HMM can then be written as {n, m, P, Q}.

What is an HMM good for? Consider the following experiment.

We draw a die with replacement k times, rolling it each time and recording the number a_i after each throw. When the throws are finished we have the number sequence A = [a1, a2, ..., ak]. Which die sequence produced A? Many are possible: each draw can be any of the n dice, so there are u = n^k possible die sequences in total, denoted {B1, B2, ..., Bu}.

Note: because the dice are drawn with replacement, the same die may be drawn more than once. The recorded numbers may also repeat, and two identical numbers may come from the same die or from different dice.

Note: a die's number identifies the die, so a sequence of die numbers is equivalent to a sequence of dice.

2 Basic Problem

Given a particular die sequence Bi = [b1, b2, ..., bk], what is the probability that it produces the number sequence A?

Solution: multiply the probabilities along the sequence. Assuming some initial distribution π over the dice for the first draw (e.g., uniform),

    P(A, Bi) = π(b1) · q(b1, a1) · ∏ over t = 2..k of p(b(t-1), bt) · q(bt, at).
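As a concrete sketch of the multiplication (the dice counts, matrices P and Q, and the uniform initial distribution pi below are made-up illustrative values, not values from the text):

```python
import numpy as np

# Toy HMM with n = 2 dice and m = 3 faces. P, Q, and the uniform
# initial distribution pi are made-up illustrative values.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])       # P[i, j]: prob. of drawing die j after die i
Q = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.3, 0.6]])  # Q[i, j]: prob. that die i shows face j
pi = np.array([0.5, 0.5])        # assumed uniform distribution for the first draw

def joint_prob(dice, numbers):
    """P(A, B): multiply the start, transition, and emission probabilities."""
    prob = pi[dice[0]] * Q[dice[0], numbers[0]]
    for t in range(1, len(dice)):
        prob *= P[dice[t - 1], dice[t]] * Q[dice[t], numbers[t]]
    return prob

# Die sequence B = [1, 2, 2] producing numbers A = [1, 3, 2] (0-based indices):
print(joint_prob([0, 1, 1], [0, 2, 1]))  # prints roughly 0.0081
```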

3 Decoding Problem

Given the observed number sequence A, what is the most likely corresponding die sequence?

Equivalently: find the die sequence under which A has the largest probability of occurring.

Solution one: brute-force enumeration. Enumerate all die sequences that could produce A, compute the probability of A under each sequence using the basic problem's solution, and among the u = n^k resulting probabilities pick the die sequence with the largest one. This method is feasible only when the number of dice n and the sequence length k are small, since u = n^k grows exponentially.

Solution two: stepwise maximum likelihood (a greedy approximation).

(1) Choose the die b1 that maximizes the probability of a1 appearing.

(2) Given b1, choose the die b2 that maximizes the probability of a2 appearing.

(3) Given b2, choose the die b3 that maximizes the probability of a3 appearing.

...

(k) Given b(k-1), choose the die bk that maximizes the probability of ak appearing.

The resulting die sequence is B = [b1, b2, ..., bk].
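The stepwise procedure can be sketched as follows (the matrices and the uniform start distribution are made-up illustrative values). Note that this greedy method is only an approximation: the exact most probable sequence is found by the Viterbi algorithm, a dynamic-programming method.

```python
import numpy as np

# Toy HMM: P, Q, pi are made-up values for illustration.
P = np.array([[0.7, 0.3], [0.4, 0.6]])
Q = np.array([[0.5, 0.3, 0.2], [0.1, 0.3, 0.6]])
pi = np.array([0.5, 0.5])

def greedy_decode(numbers):
    """At each step pick the single die that best explains the next number."""
    b = [int(np.argmax(pi * Q[:, numbers[0]]))]   # step (1)
    for a in numbers[1:]:                          # steps (2) .. (k)
        b.append(int(np.argmax(P[b[-1]] * Q[:, a])))
    return b

print(greedy_decode([0, 2, 1]))  # prints [0, 1, 1]
```

Because each step commits to one die before seeing later numbers, the greedy answer need not coincide with the globally most probable sequence from brute-force enumeration; this is why Viterbi is preferred in practice.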

4 Prediction Problem

Before drawing any dice, what is the probability of observing A = [a1, a2, ..., ak]?

Equivalently: let hi be the probability that A occurs together with die sequence Bi (i = 1, 2, ..., u); the answer is the sum of all the hi.

Solution one: brute-force enumeration. Similar to the decoding problem, except that decoding takes the largest of {h1, h2, ..., hu}, while this problem sums all of them.

Solution two: the forward algorithm.

This is an iterative computation. Let g_i be the vector whose j-th entry is the probability of observing [a1, a2, ..., ai] with the i-th draw being die j (i < k), starting from g_1(j) = π(j) · q(j, a1) for an initial distribution π (e.g., uniform). The vector for [a1, a2, ..., ai, a(i+1)] is then

    g(i+1)(j) = ( Σ over l of g_i(l) · p(l, j) ) · q(j, a(i+1)),    j = 1, ..., n,

and the probability of the whole sequence A is Sum(g_k), the sum of all entries of g_k.

Explanation: a(i+1) may come from any of the n dice, which gives the factor q(j, a(i+1)); and whichever die produces it, the previous draw could also have been any of the n dice, which is what the transition probabilities p(l, j) sum over.
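A sketch of the recursion (the matrices and the uniform start distribution pi are made-up illustrative values):

```python
import numpy as np

# Toy HMM: P, Q, pi are made-up values for illustration.
P = np.array([[0.7, 0.3], [0.4, 0.6]])
Q = np.array([[0.5, 0.3, 0.2], [0.1, 0.3, 0.6]])
pi = np.array([0.5, 0.5])

def forward(numbers):
    """Probability of observing the whole number sequence, summed over all
    possible die sequences, in O(k * n^2) time instead of O(n^k)."""
    g = pi * Q[:, numbers[0]]   # g_1(j) = pi(j) * q(j, a1)
    for a in numbers[1:]:
        g = (g @ P) * Q[:, a]   # g_(i+1)(j) = (sum_l g_i(l) p(l,j)) * q(j, a)
    return g.sum()              # Sum(g_k)

print(forward([0, 2, 1]))  # roughly 0.0306
```

Summing the brute-force probabilities of all n^k die sequences gives the same number; the forward recursion just shares the common prefixes.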

5 Learning Problem

All of the problems above assume that P and Q are known. In practice these two matrices are often unknown, or only partially known. The learning problem is the process of determining P and Q. Suppose we have many groups of observation data (many sequences A, obtained by repeated experiments; the more the better): from these observations we can (approximately) estimate P and Q. When only the observed numbers are available and the die sequences stay hidden, the standard method is the Baum-Welch (EM) algorithm.
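When the training data happen to include the die sequences as well (the fully observed case), P and Q can be estimated by simple counting. The function below is a sketch of that maximum-likelihood estimate; all names and data are illustrative, and with hidden die sequences one would run Baum-Welch instead.

```python
import numpy as np

def estimate(pairs, n, m):
    """MLE of P and Q by counting transitions and emissions in fully
    observed (die sequence, number sequence) pairs. Assumes every die
    occurs at least once (otherwise a row would be 0/0)."""
    trans = np.zeros((n, n))
    emit = np.zeros((n, m))
    for dice, numbers in pairs:
        for t, (d, a) in enumerate(zip(dice, numbers)):
            emit[d, a] += 1
            if t > 0:
                trans[dice[t - 1], d] += 1
    P = trans / trans.sum(axis=1, keepdims=True)   # normalize each row
    Q = emit / emit.sum(axis=1, keepdims=True)
    return P, Q

# Two made-up observation groups (0-based die and face indices):
P, Q = estimate([([0, 0, 1], [0, 1, 2]), ([1, 0], [2, 0])], n=2, m=3)
print(P)
print(Q)
```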


6 Related concepts

(1) Markov chains, Markov random fields (MRF), and Markov processes are, in essence, models of the same Markov family as the HMM.

(2) We draw dice k times; each draw is a random variable Xi representing the die (number) drawn. The set of values these k random variables can take is called the state space of the HMM, denoted Ω. Here, Ω = {1, 2, ..., n}.

7 Application Conditions

Applying an HMM requires three assumptions:

(1) Markov assumption: the current (hidden) state of the system or model depends only on the previous (hidden) state:

    P(Xi | X(i-1), ..., X1) = P(Xi | X(i-1))

(2) Output independence assumption: an observation depends only on the state that produced it.

(3) Stationarity assumption: the transition probabilities are independent of the specific time at which a transition occurs.

A more philosophical reading of the Markov assumption: the future depends on the present, not on the past.

8 The relationship between Markov random fields (MRF), Gibbs random fields (GRF), and conditional random fields (CRF)

