PGM: Undirected graphical models: Markov networks

Tags: nets

http://blog.csdn.net/pipisorry/article/details/52489321

Markov Network

In computer vision, Markov networks are commonly referred to as Markov random fields (MRFs).

A Markov network is a way to represent a joint distribution over a set of variables X.

Like a Bayesian network, a Markov network can be seen as defining a set of independence assumptions determined by its graph structure.




The Misconception example: motivation for undirected models
P-map

An example of a distribution for which no Bayesian network can be built as a perfect map

x1 means the student has the misconception; x0 means the student does not.

Example 3.8

NOTE: in the candidate Bayesian network, B and D become dependent when only C is given. [PGM: Bayesian network ]

How an undirected model resolves the Misconception example


Factors between pairs of variables

Partition function

Designing and computing the joint distribution

The joint distribution in Figure 4.2 is designed and computed as follows:

Summing out A, C, and D gives P(b1) = 0.732 and P(b0) = 0.268.

If we observe that Charles does not have the misconception (c0), then P(b1 | c0) = 0.06.
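As a check on the numbers above, a minimal sketch that enumerates the joint distribution of the Misconception network by brute force. The factor values are the classic ones from Koller and Friedman's example and are assumed here rather than taken from this post:

```python
from itertools import product

# Pairwise factors of the Misconception network A - B - C - D - A.
# Values follow the classic example (Koller & Friedman, Fig. 4.1);
# treat them as assumed illustrative numbers.
phi_AB = {(0, 0): 30, (0, 1): 5, (1, 0): 1, (1, 1): 10}
phi_BC = {(0, 0): 100, (0, 1): 1, (1, 0): 1, (1, 1): 100}
phi_CD = {(0, 0): 1, (0, 1): 100, (1, 0): 100, (1, 1): 1}
phi_DA = {(0, 0): 100, (0, 1): 1, (1, 0): 1, (1, 1): 100}

def unnormalized(a, b, c, d):
    """Unnormalized measure: product of the four pairwise factors."""
    return phi_AB[a, b] * phi_BC[b, c] * phi_CD[c, d] * phi_DA[d, a]

# Partition function Z: sum of the unnormalized measure over all assignments.
Z = sum(unnormalized(*x) for x in product((0, 1), repeat=4))

# Marginal P(b1) by summing out A, C, D.
p_b1 = sum(unnormalized(a, 1, c, d) for a, c, d in product((0, 1), repeat=3)) / Z

# Conditional P(b1 | c0): restrict to c = 0 and renormalize.
num = sum(unnormalized(a, 1, 0, d) for a, d in product((0, 1), repeat=2))
den = sum(unnormalized(a, b, 0, d) for a, b, d in product((0, 1), repeat=3))
p_b1_given_c0 = num / den

print(Z, round(p_b1, 3), round(p_b1_given_c0, 2))
```

With these assumed values, P(b1 | c0) rounds to 0.06 as quoted, and the marginal P(b1) comes out near 0.73; the exact digits depend on the assumed table entries.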

Independencies of the factorized distribution




Parameterization

Factorization is defined as a product of factors (usually nonnegative functions) over groups of nodes in the graph.

The definition of factors and their scopes


Pairwise parameterization

Is the pairwise parameterization a fully general representation? That is, if it is applied to an arbitrary Markov network, is it equivalent to the clique potential parameterization?

Note: in the pairwise parameterization, the number of parameters grows quadratically with the number of variables.
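To make the quadratic growth concrete, a sketch of the parameter counts, assuming n variables each taking d values:

```latex
\underbrace{\binom{n}{2}\, d^2}_{\text{pairwise factors}} = O(n^2 d^2)
\qquad\text{versus}\qquad
\underbrace{d^n}_{\text{full joint table}}
```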

Factor product

A more general and reasonable way to design factors? That is, design factors between pairs of variables, and then obtain factors over sets of three or more variables as products of the pairwise factors. This applies to computing the clique potentials.
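A minimal sketch of the factor product operation itself, assuming factors are represented as a (scope, table) pair where the scope is a tuple of variable names and the table maps value tuples to nonnegative numbers:

```python
from itertools import product

def factor_product(f1, f2):
    """Multiply two factors given as (scope, table) pairs. Variables shared
    by both scopes must agree in the combined assignment; the result's scope
    is the union of the two scopes."""
    scope1, t1 = f1
    scope2, t2 = f2
    scope = scope1 + tuple(v for v in scope2 if v not in scope1)
    table = {}
    for vals in product((0, 1), repeat=len(scope)):  # binary variables assumed
        assign = dict(zip(scope, vals))
        v1 = t1[tuple(assign[v] for v in scope1)]
        v2 = t2[tuple(assign[v] for v in scope2)]
        table[vals] = v1 * v2
    return scope, table

# Example: two pairwise factors multiplied into a factor over {A, B, C}.
phi_AB = (("A", "B"), {(0, 0): 30, (0, 1): 5, (1, 0): 1, (1, 1): 10})
phi_BC = (("B", "C"), {(0, 0): 100, (0, 1): 1, (1, 0): 1, (1, 1): 100})
scope, table = factor_product(phi_AB, phi_BC)
```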

Gibbs Distribution and Markov network

{A more general concept, the factor product, defines a parameterized distribution over the graph; it also shows how to view a Bayesian network as a Gibbs distribution. }

Gibbs distributions and the partition function

Note: introducing evidence into a Bayesian network yields an unnormalized measure that is also a Gibbs distribution, and its partition function equals the probability of the evidence.
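In symbols, a Gibbs distribution over factors with scopes D_1, …, D_K is the normalized factor product:

```latex
\tilde P(X_1,\dots,X_n) = \prod_{i=1}^{K} \phi_i(D_i), \qquad
P(X_1,\dots,X_n) = \frac{1}{Z}\,\tilde P(X_1,\dots,X_n),
\qquad
Z = \sum_{X_1,\dots,X_n} \tilde P(X_1,\dots,X_n),
```

where Z is the partition function that normalizes the measure.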

Other examples of how factors affect the distribution


Clique potential parameterization: factorization over complete subgraphs

{One of three ways to parameterize a Markov network}

[ graph theory ]

If, as defined above, factors are associated only with complete subgraphs, the independence assumptions derived from the network structure are not compromised. (defined formally later)

The clique potential parameterization may use a factor whose scope is the entire graph, and can therefore require an exponential number of parameters.

Examples of Markov networks and their cliques

Pairwise Markov networks

Reduced Markov networks

Definition and examples of reduction


… (not yet understood)

The reduced Markov network

Applications of Markov networks in computer vision

Image denoising, deblurring, three-dimensional reconstruction, object recognition

Object Recognition (Image segmentation)





The independencies of Markov networks

The parameterization of the undirected model class is tied to the graph structure, which is used to derive the independence properties of the distribution. The undirected graph gives an intuitive interpretation of the interactions among variables; here the graph is viewed as expressing a set of independence assertions.

{In undirected models, it can be shown that the graph serves as a data structure for specifying probability distributions in factorized form. }

Basic independencies: global independence from separation in undirected graphs

Intuitively, in a Markov network, probabilistic influence flows along undirected paths in the graph, but the flow is blocked when we condition on intermediate nodes.
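The blocking intuition corresponds to graph separation: X and Y are separated given Z whenever every path between them is cut by Z. A minimal sketch of the separation test, assuming the graph is stored as a dict of neighbour sets:

```python
from collections import deque

def separated(adj, X, Y, Z):
    """Return True if every path from X to Y in the undirected graph `adj`
    (dict: node -> set of neighbours) passes through the observed set Z.
    Implemented as a BFS that never enters observed nodes."""
    blocked = set(Z)
    frontier = deque(x for x in X if x not in blocked)
    seen = set(frontier)
    while frontier:
        u = frontier.popleft()
        if u in Y:
            return False            # found an active (unblocked) path
        for v in adj[u]:
            if v not in blocked and v not in seen:
                seen.add(v)
                frontier.append(v)
    return True

# The 4-cycle A - B - C - D - A from the Misconception example:
adj = {"A": {"B", "D"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"A", "C"}}
```

On this 4-cycle, A and C are separated given {B, D}, while conditioning on B alone leaves the path through D active, so A and C remain dependent.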

Limitations of expressiveness: some independence models cannot be expressed by a Markov network structure


Soundness

Soundness: any distribution that factorizes over G satisfies the independencies implied by separation.


NOTE: I-map, the set of independencies associated with P: I(G) ⊆ I(P).

That is, the independencies read from G all hold in P, but the independencies of P need not all be reflected in G.

[PGM: Bayesian network ]


Completeness


Independencies revisited

Two local Markov assumptions: pairwise independence and the Markov blanket


Relationships among the three Markov properties

For positive distributions, the three sets of independence assertions associated with the network structure H defined earlier (Ip(H), Il(H), and I(H)) are equivalent.
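In symbols, following the book's notation for a structure H over variables $\mathcal{X}$, the three assertion sets are:

```latex
% pairwise: nonadjacent variables are independent given everything else
I_p(H) = \{\, (X \perp Y \mid \mathcal{X} \setminus \{X, Y\}) : X\text{--}Y \notin H \,\}

% local: each variable is independent of the rest given its Markov blanket
I_\ell(H) = \{\, (X \perp \mathcal{X} \setminus \{X\} \setminus \mathrm{MB}_H(X) \mid \mathrm{MB}_H(X)) : X \in \mathcal{X} \,\}

% global: separation in the graph
I(H) = \{\, (X \perp Y \mid Z) : \mathrm{sep}_H(X ; Y \mid Z) \,\}
```

For positive distributions these impose the same constraints; in general only $I(H) \Rightarrow I_\ell(H) \Rightarrow I_p(H)$ holds.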


From distributions to graphs

{Using a graph structure to encode the independencies of a given distribution P}

Similar to Bayesian networks

A counterexample to strong completeness: Markov networks cannot replace Bayesian networks? Example 4.8





Parameterization revisited

Markov network parameterization methods: products of clique potentials, factor graphs representing the product of factors, and feature sets representing a product over weighted features.

Methods for finer-grained parameterization

{Alternative representations for Markov network parameterization, replacing the (pairwise and) clique potential parameterizations mentioned above}

Factor graph

The Markov network structure usually does not reveal all the structure in a Gibbs parameterization. In particular, one cannot tell from the graph alone whether a factor in the parameterization covers a maximal clique or only a subset of it.



In A the factor covers the maximal clique, while in B the factors cover smaller cliques, so the two parameterizations differ?
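A small sketch of why the network alone cannot distinguish the two cases: a single factor over a maximal clique {A, B, C} and three pairwise factors induce exactly the same edges, even though their factor graphs differ (variable names are illustrative):

```python
from itertools import combinations

def edges_from_scopes(scopes):
    """Markov network edges induced by a Gibbs parameterization:
    connect every pair of variables that co-occur in some factor scope."""
    edges = set()
    for scope in scopes:
        edges |= set(combinations(sorted(scope), 2))
    return edges

clique_factor = [("A", "B", "C")]                        # one factor, maximal clique
pairwise_factors = [("A", "B"), ("B", "C"), ("A", "C")]  # three pairwise factors

# Both parameterizations induce the same triangle as a Markov network,
# but a factor graph would show one factor node versus three.
```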

Log-linear models

Converting factors to log-space


NOTE: ln(30) ≈ 3.4
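The conversion replaces each factor value phi with an energy -ln(phi), so products of factors become sums of energies; the note's constant is ln(30) ≈ 3.4. A minimal sketch with illustrative table values:

```python
import math

def to_energy(table):
    """Convert a positive factor table to an energy table: eps = -ln(phi).
    Products of factors then become sums of energies."""
    return {assignment: -math.log(v) for assignment, v in table.items()}

phi = {(0, 0): 30, (0, 1): 5, (1, 0): 1, (1, 1): 10}   # illustrative values
eps = to_energy(phi)
# phi(0,0) = 30 corresponds to energy -ln(30), and ln(30) ≈ 3.4;
# exponentiating the negative energy recovers the original factor value.
```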

Features and indicator features


The log-linear model
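The model referred to here is standardly written in terms of weights $w_i$ and features $f_i$ over scopes $D_i$:

```latex
P(X_1,\dots,X_n) = \frac{1}{Z}
  \exp\!\Big( -\sum_{i=1}^{k} w_i\, f_i(D_i) \Big)
```

A full factor table is recovered with indicator features: one feature per assignment of the factor's scope, with the weight set to the negative log of the corresponding table entry.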

Discussion of parameterization methods

Clearly, each representation is finer-grained than, and equally expressive as, the previous one. A factor graph can describe any Gibbs distribution, and a feature set can represent all the table entries of each factor in a factor graph.

Metric Markov random fields


Over-parameterization

The same distribution can be represented by infinitely many parameterizations of a Markov network (given its structure); among these we can choose one canonical parameterization for the distribution.


Canonical parameterization

A formula for computing the canonical energy function


Example of the canonical energy function for the Misconception example



NOTE: ln(1.4*10^-6) ≈ -13.48

The canonical parameterization defines the same distribution as the original distribution P

Conclusion: for positive distributions, all four conditions (factorization and the three Markov properties) are equivalent.

Eliminating redundancy

{Another approach to handling over-parameterization}

…


from:http://blog.csdn.net/pipisorry/article/details/52489321

