Markov Random Fields and Machine Learning

Source: Internet
Author: User
Stochastic Processes

Throughout modern science and society one encounters a mathematical model called the stochastic process: from fluctuations in galactic brightness to the distribution of matter in galactic space, from the Brownian motion of molecules to the evolution of atoms, from chemical reaction kinetics to the theory of telephone traffic, from the spread of rumors to the prevalence of infectious diseases, from market forecasting to code breaking, the theory of stochastic processes and its applications are almost ubiquitous. The first such process model to be proposed and studied theoretically is the Markov chain, another great contribution of Markov to probability theory and to the development of human thought. [1] A stochastic process describes the random motion of a particle in a space; it is a quantitative description of the dynamic relationships among a series of random events. Stochastic processes are closely related to other branches of mathematics, such as differential equations and complex analysis, and are an important tool for studying random phenomena in natural science, engineering, and social science. [2]

Markov Processes and Markov Chains

A Markov process is one in which the value at the next point in time depends only on the current value and not on the past: the future depends on the present rather than on history. A popular metaphor is a memoryless white mouse moving among a number of caves. Because the mouse has no memory, a momentary impulse determines its leap from one cave to another; once its current location is known, its next step is unrelated to the path it has previously taken.
The philosophical significance of this model is clear. In the words of the Soviet mathematician A. Ya. Khinchin (1894-1959), it acknowledges that there are phenomena in the objective world whose future is determined by the present in such a way that knowledge of the past adds nothing to this determination. The property that, conditional on the known "present", the "future" and the "past" are independent is called the Markov property; a stochastic process with this property is called a Markov process, and its most primitive model is the Markov chain. To put it another way: a Markov process is a kind of stochastic process whose original model, the Markov chain, was proposed by the Russian mathematician A. A. Markov in 1907. The process has the following characteristic: the future evolution, given the known current state, does not depend on the past evolution. For example, the change in the number of animals in a forest forms a Markov process. Many real-world processes are Markov processes, such as the Brownian motion of particles in a liquid, the number of people infected in an epidemic, and the number of customers waiting at a station. In the study of such processes, A. N. Kolmogorov's 1931 paper "Analytical Methods in Probability Theory" first applied differential-equation methods to this class of processes and laid the theoretical foundation of Markov process theory. The theory of stochastic differential equations established by Itô around 1951 opened a new path for the study of Markov processes, and around 1954 W. Feller introduced semigroup methods into the field. Markov processes and Markov vector fields on manifolds remain areas for in-depth study.
In practice one often encounters stochastic processes with the following characteristic: given the current state (the "present"), the future evolution does not depend on the past evolution. The property that, under this known "present", the "future" and the "past" are independent is called the Markov property, and stochastic processes with this property are called Markov processes. The jumping of a frog in a lotus pond is a vivid example of a Markov process. The frog jumps from one lotus leaf to another on momentary impulse; because the frog has no memory, once its position is known, where it jumps next is unrelated to the path it has traveled. If the lotus leaves are numbered and X0, X1, X2, ... denote the leaf numbers occupied by the frog initially and after the first, second, ... jumps, then {Xn, n ≥ 0} is a Markov process. The Brownian motion of particles in a liquid, the number of people infected in an epidemic, the jumps of a free electron between electron shells, the growth of a population, and so on can all be regarded as Markov processes. Some processes, such as certain genetic processes, can be approximated by Markov processes under certain conditions. [1]

Markov Random Fields

The term Markov random field carries two layers of meaning. Markov property: for a sequence of random variables ordered in time, the distribution at time n+1 depends only on the value at time n and is independent of the values of the random variables before time n. Take the weather as an analogy: if we assume the weather is Markov, we assume that the probability distribution of today's weather depends only on yesterday's weather and has nothing to do with the weather of the day before yesterday or any earlier day. Similarly, the spread of infectious diseases and of rumors can be modeled as Markov.
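The memoryless dynamics described above (the frog's jumps, the weather analogy) can be sketched as a short simulation. The three-state transition matrix below is purely illustrative, not from the source; the point is that the sampler for the next state looks only at the current state.

```python
import random

# Illustrative transition matrix for a frog on 3 lotus leaves:
# P[i][j] = probability of jumping from leaf i to leaf j.
P = [
    [0.1, 0.6, 0.3],
    [0.4, 0.2, 0.4],
    [0.5, 0.3, 0.2],
]

def next_leaf(current, rng):
    """Sample the next leaf: depends only on the current leaf (Markov property)."""
    return rng.choices(range(len(P)), weights=P[current])[0]

def simulate(start, steps, seed=0):
    """Generate a path X0, X1, ..., Xsteps; no step looks further back than one."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(steps):
        path.append(next_leaf(path[-1], rng))
    return path

path = simulate(start=0, steps=10)
print(path)  # the exact leaf sequence depends on the seed
```

Note that `simulate` never consults `path[:-1]` when sampling: conditioning on the full history gives the same distribution as conditioning on the present, which is exactly the Markov property.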
Random field: when a value from a phase space is assigned at random to each position, the result is called a random field. A farmland analogy helps. There are two concepts: position (site) and phase space. A "position" is like an acre of farmland; the "phase space" is like the set of possible crops. We can plant different crops in different fields, which is like assigning each site a value from the phase space. Put plainly, a random field records which crop is planted in which field. Markov random field: a Markov random field is a random field with the Markov property. In the farmland analogy, if the crop planted in any field depends only on the crops in the neighboring fields, and not on the crops anywhere else, then the collection of crops over these fields is a Markov random field. [1]

Mathematical Description

A Markov random field is obtained by adding the Markov property to a random field. A Markov random field can be mapped to an undirected graph: each node in the graph corresponds to a random variable, and an edge between two nodes represents a dependency between the corresponding random variables. Thus in a Markov random field only some relationships need to be considered while others may be ignored: each random variable depends only on the random variables adjacent to it, and is conditionally independent of the non-adjacent ones.
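The farmland analogy can be made concrete in a few lines. The site names and crop names below are invented for illustration; the sketch shows a plain random field, with each site assigned a value from the phase space independently (the neighborhood constraint that makes it a Markov random field comes later).

```python
import random

sites = ["field_A", "field_B", "field_C"]   # positions (sites)
phase_space = ["wheat", "corn", "rice"]     # possible values (phases)

def sample_random_field(seed=42):
    """A random field: assign each site a random value from the phase space.
    No Markov structure yet -- that is added by the neighborhood constraint."""
    rng = random.Random(seed)
    return {s: rng.choice(phase_space) for s in sites}

field = sample_random_field()
print(field)  # e.g. a mapping like {'field_A': ..., 'field_B': ..., 'field_C': ...}
```

A configuration of the field is thus simply a function from sites to phases; the Markov property is a statement about the conditional distribution of one site's value given the rest.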
Let the neighborhood system be N = {N_s, s ∈ S} over the set of sites S. A random field X is a Markov random field with respect to this neighborhood system if it satisfies the following conditions: (1) positivity: P(X = x) > 0 for every configuration x; (2) Markovianity: P(X_s = x_s | X_r = x_r, r ≠ s) = P(X_s = x_s | X_r = x_r, r ∈ N_s). Condition (2) is called the local characteristic of the Markov random field. A Markov random field is also called a Markov network. The undirected graphical model is likewise called a Markov random field (Markov random fields) or Markov network (Markov network), and it has a simple definition of independence: two node sets A and B are conditionally independent given a third node set C if every path between a node in A and a node in B passes through a node in C. By contrast, the directed graphical model is called a Bayesian network (Bayesian networks) or belief network (belief networks), and it has a more complex notion of independence. Formally, a Markov network consists of: (1) a graph G = (V, E), where each vertex v ∈ V represents a random variable and each edge {u, v} ∈ E represents a dependency between the random variables u and v; (2) a set of potential functions φ_k (also called factors or clique potentials, sometimes called features), one for each clique k of the graph G. Each φ_k maps a joint assignment of the variables in clique k to a nonnegative real number. Joint distribution function: the joint distribution (Gibbs measure) represented by a Markov network is P(X = x) = (1/Z) ∏_k φ_k(x_(k)), where x is the configuration vector, x_(k) is the state of the variables in the k-th clique, and the product runs over all cliques in the graph. Note that the Markov dependencies live within cliques; there is no direct dependency across cliques. Here Z is the partition function, Z = ∑_x ∏_k φ_k(x_(k)). In practice a Markov network is often expressed as a log-linear model. By introducing feature functions f_k we get P(X = x) = (1/Z) exp(∑_k w_k · f_k(x_(k))), with partition function Z = ∑_x exp(∑_k w_k · f_k(x_(k))), where w_k is a weight vector and f_k is a potential-defining feature map from clique k to real numbers.
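The Gibbs joint distribution above can be computed exactly for a tiny network. The sketch below uses a single pairwise clique over two binary variables with an invented potential that favors agreement; it builds the partition function Z by brute-force enumeration and checks that the resulting probabilities sum to 1.

```python
from itertools import product

# Illustrative pairwise clique potential on two binary variables x, y:
# larger value when x == y (encourages agreement); any nonnegative values work.
def phi(x, y):
    return 2.0 if x == y else 1.0

states = [0, 1]

# Partition function Z: sum of the clique product over all joint assignments.
Z = sum(phi(x, y) for x, y in product(states, states))

def joint(x, y):
    """Gibbs distribution P(x, y) = phi(x, y) / Z."""
    return phi(x, y) / Z

print(Z)  # 6.0 for this potential: 2 + 1 + 1 + 2
print(sum(joint(x, y) for x, y in product(states, states)))  # 1.0
```

For larger graphs the same formula applies with a product over all cliques, but Z then involves a sum over exponentially many configurations, which is exactly why inference (and the likelihood gradient mentioned below) is hard in general.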
These functions are sometimes called Gibbs potentials; the term "potential" comes from physics, where it is usually understood literally as the potential energy in the neighborhood of a position. The log-linear model is a convenient way to express these potentials. Such a model can compactly represent many distributions, especially over very large domains, and the fact that the negative log-likelihood is a convex function is a further convenience. However, even though the likelihood of a log-linear Markov network is convex, evaluating its gradient still requires inference in the model, which is generally hard to compute. [1]

Markov Properties

Markov networks have the following Markov property: a vertex u in the graph depends only on its immediate neighbors and, given them, is conditionally independent of every other node in the graph. The set of immediate neighbors of a vertex u is accordingly known as the Markov blanket of u. [3]

Characteristics of Markov Random Fields

The Markov random field has several distinct characteristics:
(1) In the Markov model, spatial relationships between pixels can be propagated through pixel-to-pixel interactions, so the relationships between pixels can be described with a low-order Markov random field;
(2) The Markov random field model can express not only the randomness of the image but also its underlying structure, so the nature of a road scene can be expressed very well;
(3) The Markov random field model is based on a physical model and is directly related to the data of the road-scene image (gray values or features);
(4) Besag's in-depth study of MRFs established the relationship between the Gibbs distribution and the Markov random field, linking Markov random fields to energy functions;
(5) To resolve the uncertainty in the Markov random field description, statistical decision and estimation theory together with Bayesian theory are used: prior knowledge of the road scene is expressed by a prior distribution model, and the maximum a posteriori (MAP) estimate is used as the criterion for road-scene segmentation. [4]
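Points (4) and (5) above connect the MRF to an energy function and MAP estimation. A minimal sketch of that idea, assuming an Ising-like smoothness prior and greedy iterated conditional modes (ICM) as the MAP approximation (the parameter beta, the toy image, and the energy weights are all illustrative, not from the source):

```python
# Minimal ICM sketch for binary image denoising with an Ising-like MRF prior:
# energy(label) = data term (match the observation) + beta * smoothness term
# (agree with the 4-connected neighbors). MAP is approximated greedily.

def neighbors(i, j, h, w):
    """4-connected neighborhood of pixel (i, j) inside an h x w grid."""
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < h and 0 <= nj < w:
            yield ni, nj

def local_energy(labels, obs, i, j, val, beta):
    # Data term: penalize disagreement with the observed pixel.
    e = 0.0 if val == obs[i][j] else 1.0
    # Smoothness term: penalize disagreement with neighboring labels.
    h, w = len(obs), len(obs[0])
    e += beta * sum(val != labels[ni][nj] for ni, nj in neighbors(i, j, h, w))
    return e

def icm(obs, beta=1.0, iters=5):
    """Greedy MAP approximation: repeatedly set each pixel to the label
    that minimizes its local energy given its current neighbors."""
    h, w = len(obs), len(obs[0])
    labels = [row[:] for row in obs]
    for _ in range(iters):
        for i in range(h):
            for j in range(w):
                labels[i][j] = min(
                    (0, 1),
                    key=lambda v: local_energy(labels, obs, i, j, v, beta))
    return labels

noisy = [[0, 0, 0],
         [0, 1, 0],   # a single flipped pixel the prior should smooth away
         [0, 0, 0]]
print(icm(noisy, beta=1.0))  # -> all zeros: the isolated 1 is removed
```

With beta = 0 the data term dominates and ICM simply returns the observation; as beta grows, the smoothness prior removes isolated labels, which is the sense in which the MAP criterion acts as the segmentation standard.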
