Some details of HMM


This time the teacher had everyone discuss HMMs. Having just gained some understanding of the basic theory, I was ready to write up a few things of my own, so I will start with HMMs.

I had seen HMMs used to solve practical problems in many papers before, though often largely as a way to add some shine to the work, so I had paid little attention to the underlying theory. Recently I studied it seriously, partly in response to the teacher's call and partly to build up some theoretical background. I will not write too much here, since there is already a great deal of detailed material on HMMs; for completeness, I will just sketch my understanding of the overall process.

An HMM is fundamentally a generative model built for a problem; once built, the model can be used for classification and recognition. An HMM is specified by a triple (π, A, B): the initial state probabilities, the state transition probability matrix, and the confusion (emission probability) matrix. In practical applications there are three classes of problems: evaluation, decoding, and learning. The key to all three is the forward and backward algorithms, i.e. computing the alpha and beta matrices; once those are in hand, the rest falls into place. Put simply, evaluation is the problem of computing alpha; decoding is the problem of finding the maximum-weight path through the trellis (essentially the Viterbi algorithm); and learning uses the EM algorithm to iteratively recompute the alpha and beta matrices until convergence. (Apparently someone has proposed a newer learning algorithm; I will set that aside for now.)
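To make the evaluation problem concrete, here is a minimal Python sketch of the forward algorithm. The model parameters and observation sequence are made-up toy numbers, purely for illustration:

```python
import numpy as np

def forward(pi, A, B, obs):
    """Forward algorithm: alpha[t, i] = P(o_1..o_t, state at t = i)."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]                 # initial distribution times emission
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    return alpha

# Toy two-state, two-symbol model (illustrative numbers only).
pi = np.array([0.6, 0.4])
A  = np.array([[0.7, 0.3],
               [0.4, 0.6]])
B  = np.array([[0.9, 0.1],
               [0.2, 0.8]])
obs = [0, 1, 0]

alpha = forward(pi, A, B, obs)
likelihood = alpha[-1].sum()   # P(O | model) = sum_i alpha[T-1, i]
```

Summing the last row of alpha gives the probability of the whole observation sequence under the model, which is exactly the evaluation answer.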

The HMM itself is a very simple idea, or at least relatively easy to understand. Below I just write down some detailed questions I ran into along the way. I may not be able to answer all of them fully myself; where I can, I will add the exact answer once I know it.

1. The boundary problem. Different references define the states differently: some define an explicit final state among the hidden states, and some do not. The meaning is basically the same, but if the formulas are not applied according to the matching convention, a boundary problem arises: the definitions differ when handling t = 0 or t = T, and some formula subscripts even go out of bounds. I think this is something to pay attention to in practical applications.
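One way to pin the boundary conventions down is to write them out explicitly. In the convention sketched below (0-indexed time, no explicit final state), alpha is seeded at t = 0 with the initial distribution and beta is seeded at t = T−1 with all ones; the toy numbers are again made up:

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """Forward-backward pass with explicit boundary conditions."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    beta  = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]   # boundary at t = 0: pi times first emission
    beta[T - 1] = 1.0              # boundary at t = T-1: no final state, so all ones
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    return alpha, beta

pi = np.array([0.5, 0.5])
A  = np.array([[0.8, 0.2], [0.3, 0.7]])
B  = np.array([[0.6, 0.4], [0.1, 0.9]])
obs = [0, 0, 1]
alpha, beta = forward_backward(pi, A, B, obs)
```

A useful sanity check on the boundary handling: sum_i alpha[t, i] * beta[t, i] must give the same sequence likelihood at every t; if a subscript convention is off by one, this invariant breaks immediately.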

2. Training on multiple observation sequences. The teacher raised a question for us to think about: taking a one-dimensional HMM as an example, a single observation sequence can be used to train an HMM; but given several observation sequences of the same class, how should the HMM be trained? Two approaches were discussed at the time: one was to keep training the already-trained HMM on each new sequence, updating on that basis; the other was to train an HMM on each sequence separately and average the parameters. Personally I suspect neither of these is ideal, and it would not be surprising if there are other approaches; I will look into this and should be able to draw on what others have done.
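For what it is worth, the standard Baum-Welch treatment of multiple sequences is a third option besides the two above: accumulate the expected counts (the E-step statistics) over all sequences first, then normalize once, rather than averaging fully trained models. The sketch below shows just this pooling step for the transition matrix, using hypothetical expected-count matrices; the count values are invented for illustration:

```python
import numpy as np

def pool_and_normalize(transition_counts):
    """M-step over several sequences: sum expected transition counts
    across sequences first, then row-normalize once."""
    total = np.sum(transition_counts, axis=0)        # pool E-step statistics
    return total / total.sum(axis=1, keepdims=True)

# Hypothetical expected-count matrices from two sequences of the same class.
counts_seq1 = np.array([[4.0, 1.0], [2.0, 3.0]])
counts_seq2 = np.array([[1.0, 4.0], [1.0, 1.0]])
A_pooled = pool_and_normalize([counts_seq1, counts_seq2])

# The naive alternative discussed above: normalize per sequence, then average.
A1 = counts_seq1 / counts_seq1.sum(axis=1, keepdims=True)
A2 = counts_seq2 / counts_seq2.sum(axis=1, keepdims=True)
A_averaged = (A1 + A2) / 2
```

The two results generally differ when the sequences contribute unequal amounts of evidence: pooling weights each sequence by its counts, while parameter averaging weights every sequence equally.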

3. Implementation details. To understand the material, I wrote simple MATLAB implementations of the three applications: evaluation, decoding, and learning. Only when implementing did I discover that things I thought I knew are sometimes not so clear once written down, such as the boundary problem just mentioned. If we want to use HMMs for research later, I hope we can come to understand them more thoroughly through in-depth work.
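As an illustration of the decoding application, here is a minimal Python sketch of the Viterbi recursion (not my MATLAB code; the model numbers are again toy values). Working in the log domain avoids underflow on longer sequences:

```python
import numpy as np

def viterbi(pi, A, B, obs):
    """Most likely hidden-state path for an observation sequence (log domain)."""
    T, N = len(obs), len(pi)
    logd = np.zeros((T, N))               # delta: best log-prob of a path ending in state i
    back = np.zeros((T, N), dtype=int)    # backpointers
    logd[0] = np.log(pi) + np.log(B[:, obs[0]])
    for t in range(1, T):
        scores = logd[t - 1][:, None] + np.log(A)   # scores[i, j]: best path i -> j
        back[t] = scores.argmax(axis=0)
        logd[t] = scores.max(axis=0) + np.log(B[:, obs[t]])
    # Backtrack from the best final state.
    path = [int(logd[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

pi = np.array([0.6, 0.4])
A  = np.array([[0.7, 0.3], [0.4, 0.6]])
B  = np.array([[0.9, 0.1], [0.2, 0.8]])
path = viterbi(pi, A, B, [0, 0, 1])   # best path: [0, 0, 1]
```

For a sequence this short the result can be checked by enumerating all 2^3 state paths by hand, which is a handy way to validate an implementation before trusting it on real data.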

4. Advanced applications. HMMs were originally used in speech research, where they work very well; recently they have also been used a great deal in image processing, especially image segmentation. For texture segmentation, for example, the HMT (hidden Markov tree) was proposed, as well as the combination of HMT and GMM, using the HMM to drive the segmentation algorithm. I do not understand this area very clearly yet; I mainly skimmed the 2001 IEEE Trans. Image Processing paper "Multiscale Image Segmentation Using Wavelet-Domain Hidden Markov Models", and much of it I did not follow. My main questions about HMT are: (1) how to simplify the HMT in the paper and map it onto the basic theory I have seen so far; (2) how the training process is carried out. For the latter, the main reference should be the 1999 IEEE Trans. Signal Processing paper "Wavelet-Based Statistical Signal Processing Using Hidden Markov Models", but I only browsed it; its discussion of training just mentions the EM algorithm without going into specifics. In short, I still do not understand the concrete application very well, and may need to study it further if necessary.

The basic theory should not be difficult; the main question is how to apply it in practice. If I want to do in-depth research with HMMs, I should spend a bit more time conquering this area, but there is a lot going on at the moment, so it will have to wait. (That sounds like the usual excuse I hate hearing. Eh, tragic.)
