HMM Series, Part 1: Introduction


Transferred from: http://www.comp.leeds.ac.uk/roger/HiddenMarkovModels/html_dev/main.html

Introduction

We often want to find patterns that occur over time, such as the commands a person tends to use, phrase sequences in sentences, or sound sequences in speech. This series of articles has three parts: 1. an introduction to systems that generate probabilistic patterns; 2. systems in which what we want to predict differs from what we can observe; 3. problems that can be solved by modelling such systems, such as weather forecasting.

http://www.comp.leeds.ac.uk/roger/HiddenMarkovModels/html_dev/gen_patterns/s1_pg1.html

Generating Patterns

Patterns divide into deterministic patterns and non-deterministic patterns. In a deterministic pattern, the next state is fully determined by the previous states: traffic lights, for example, always transition red - green - amber in a fixed cycle. In a non-deterministic pattern, such as the weather, there are three kinds of weather: sunny, cloudy, and rainy. Tomorrow's weather depends on the weather that came before it but cannot be predicted exactly, so the pattern is non-deterministic.

To simplify the problem, we make the Markov assumption: the current state depends only on some fixed number of past states. This lets us predict future weather from the weather of the past few days.

A Markov process is one in which the state transitions of the system depend only on the previous n states; such a model is called an order-n model. The simplest Markov process is a first-order process, in which the next state depends only on the current state. Unlike the deterministic model, the state change here is probabilistic rather than certain.

For a first-order process with M states, there are M² possible transitions between states. Each transition has a state transition probability, and these probabilities together form the state transition matrix. An important assumption is that these probabilities do not change over time.
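As a sketch, a first-order process like this can be simulated directly from its transition matrix. The state names follow the weather example, but the probabilities below are invented for illustration, not taken from the tutorial:

```python
import random

STATES = ["sunny", "cloudy", "rainy"]

# State transition matrix A: A[i][j] = P(next = j | current = i).
# Each row sums to 1, and with M = 3 states there are M^2 = 9 entries.
# These numbers are illustrative placeholders.
A = [
    [0.5, 0.3, 0.2],  # from sunny
    [0.3, 0.4, 0.3],  # from cloudy
    [0.2, 0.4, 0.4],  # from rainy
]

def next_state(current, rng=random.random):
    """Sample the index of the next state given the current state index."""
    r, cum = rng(), 0.0
    for j, p in enumerate(A[current]):
        cum += p
        if r < cum:
            return j
    return len(A[current]) - 1  # guard against floating-point rounding

def simulate(start, days, seed=0):
    """Generate a weather sequence of the given length from a start state."""
    random.seed(seed)
    seq, s = [start], start
    for _ in range(days - 1):
        s = next_state(s)
        seq.append(s)
    return [STATES[i] for i in seq]
```

Because the transition probabilities are assumed constant over time, the same matrix is reused at every step of the simulation.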
For the weather example, the matrix gives, for each of today's possible states, the probability of each possible state tomorrow. To initialise such a system, we also need to define the starting weather; assume the initial state vector says that on the first day (t = 0) the weather is sunny.

We have now defined a first-order Markov process with three components:
1. States: three states - sunny, cloudy, rainy.
2. π vector: defines the probability of each state being the initial state.
3. State transition matrix: gives the probability of each state given the previous state.

http://www.comp.leeds.ac.uk/roger/hiddenmarkovmodels/html_dev/hidden_patterns/s1_pg1.html

Patterns generated by a hidden process

1. Limitations of a Markov process

A Markov process alone may not be powerful enough to make the prediction we want. Suppose we cannot observe the weather directly, but we find that the state of seaweed is related to it. We then have two sets of states: observable states (the state of the seaweed) and hidden states (the state of the weather). We want an algorithm that predicts the weather from the observed state of the seaweed together with the Markov assumption. In this case, the observable state sequence is probabilistically related to a hidden process. We model this with a hidden Markov model: a hidden Markov process that changes over time, plus observable states that are probabilistically related to the hidden states.

2. Hidden Markov model

A hidden Markov model has both hidden and observable states, with two assumptions: 1. the true weather is a first-order Markov process; 2. the hidden and observable states are probabilistically connected. In each weather state, the seaweed can be in one of four observable states. The matrix giving the probability of each observable state in each hidden state is called the confusion matrix. Each row of this matrix sums to 1.
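The components above can be written down concretely. All of the numbers here are illustrative placeholders, and the four seaweed state names are assumed for the example rather than given in the text:

```python
# A minimal hidden Markov model for the weather/seaweed example.
# Every probability below is an invented placeholder.
HIDDEN = ["sunny", "cloudy", "rainy"]     # hidden weather states
OBS = ["dry", "dryish", "damp", "soggy"]  # assumed observable seaweed states

pi = [1.0, 0.0, 0.0]  # initial probabilities: day 0 is sunny

A = [                  # state transition matrix (hidden -> hidden)
    [0.5, 0.3, 0.2],
    [0.3, 0.4, 0.3],
    [0.2, 0.4, 0.4],
]

B = [                       # confusion matrix (hidden -> observable)
    [0.60, 0.20, 0.15, 0.05],  # P(seaweed state | sunny)
    [0.25, 0.25, 0.25, 0.25],  # P(seaweed state | cloudy)
    [0.05, 0.10, 0.35, 0.50],  # P(seaweed state | rainy)
]

# Each row of A and B is a probability distribution, so it sums to 1.
for row in A + B:
    assert abs(sum(row) - 1.0) < 1e-9
```

Note that B has one row per hidden state and one column per observable state, so it need not be square.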

Definition of a hidden Markov model

A hidden Markov model (HMM) is a triple (π, A, B): the vector π of initial state probabilities, the state transition matrix A, and the confusion matrix B. The probabilities in A and B are assumed to be independent of time; this is the most unrealistic assumption of HMMs.

Uses associated with HMMs

Three types of problems can be solved with HMMs:
1. Evaluation: given an HMM, compute the probability of an observed sequence.
2. Decoding: identify the sequence of hidden states most likely to have produced an observed sequence.
3. Learning: for a given observation sequence, generate an HMM.

1. Evaluation

Given a set of HMMs and an observation sequence, we may want to find which HMM most probably produced that sequence. In the weather example, the weather behaves differently in different seasons, so we might have one model per season and infer the season from an observation sequence. The forward algorithm calculates the probability of the observed sequence under each HMM, and we select the HMM with the greatest probability. This kind of problem appears in speech recognition, where a large number of HMMs are used, each representing one word: the word an observation sequence represents is determined by identifying the most probable HMM.

2. Decoding

In other applications, we are interested in the hidden state sequence most likely to have produced a given observation sequence. In the example above, a blind hermit can only feel the state of the seaweed but wants to know the state of the weather, i.e. the hidden states. Given the observed sequence and the HMM, the Viterbi algorithm finds the most probable sequence of hidden states. A widespread application of the Viterbi algorithm is in NLP (Natural Language Processing): tagging words with their syntactic class (verb, adjective, noun, etc.). The words of a sentence form the observation sequence, and the syntactic classes are the hidden states.
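The evaluation step can be sketched with a minimal forward algorithm. The recursion is standard; the two-state toy model at the bottom is invented purely for illustration:

```python
def forward(obs, pi, A, B):
    """Return P(observation sequence | HMM (pi, A, B)).

    obs is a list of observation indices. A minimal sketch of the
    forward algorithm; no scaling is applied, so it is only suitable
    for short sequences.
    """
    n = len(pi)
    # Initialisation: alpha_0(i) = pi_i * b_i(o_0)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    # Induction: alpha_t(j) = (sum_i alpha_{t-1}(i) * a_ij) * b_j(o_t)
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    # Termination: sum over all possible final hidden states
    return sum(alpha)

# Toy two-state model (illustrative numbers only)
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.9, 0.1], [0.2, 0.8]]
```

To pick the best model for a sequence, `forward` is run once per candidate HMM and the model with the largest probability wins.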
By finding the most probable hidden states, you can find the most likely syntactic class for each word, and then build further work, such as identifying semantics, on top of that.

3. Learning

The hardest problem is to find a suitable HMM, i.e. the triple (π, A, B), given an observation sequence and the set of hidden states it represents; the hidden state sequence itself is not observed. The forward-backward algorithm is used in this case, where neither A nor B is known directly.
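The decoding problem described above can be sketched with a minimal Viterbi algorithm. As before, the two-state toy model at the bottom uses invented numbers, not values from the tutorial:

```python
def viterbi(obs, pi, A, B):
    """Return the most probable hidden state sequence (as state indices)
    for the observation sequence obs under the HMM (pi, A, B).

    A minimal sketch: probabilities are multiplied directly, so it is
    only suitable for short sequences.
    """
    n = len(pi)
    # delta[i]: probability of the best partial path ending in state i
    delta = [pi[i] * B[i][obs[0]] for i in range(n)]
    back = []  # back-pointers, one list per time step after the first
    for o in obs[1:]:
        ptrs, new = [], []
        for j in range(n):
            i = max(range(n), key=lambda k: delta[k] * A[k][j])
            ptrs.append(i)
            new.append(delta[i] * A[i][j] * B[j][o])
        back.append(ptrs)
        delta = new
    # Backtrack from the most probable final state
    path = [max(range(n), key=lambda i: delta[i])]
    for ptrs in reversed(back):
        path.append(ptrs[path[-1]])
    path.reverse()
    return path

# Toy two-state model (illustrative numbers only)
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.9, 0.1], [0.2, 0.8]]
```

Unlike the forward algorithm, which sums over all paths, Viterbi keeps only the single best path into each state, which is why back-pointers suffice to reconstruct the answer.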
