Joint Distributions
The simplest case of a joint distribution is the student model (Figure 1).
Figure 1
There are three variables: I (student intelligence, with states 0 and 1), D (exam difficulty, with states 0 and 1), and G (grade, with three states: 1, 2, and 3).
The table in Figure 1 is the joint probability distribution. We can reduce the distribution table by removing all rows that are inconsistent with an observed value.
For example, removing all rows whose G is not 1 leaves only rows 1, 4, 7, and 10. The remaining probabilities no longer sum to 1, so we renormalize them (Figure 2).
Figure 2
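To make this concrete, here is a minimal Python/NumPy sketch of the reduction and renormalization steps just described. The joint table and its probability values are illustrative stand-ins, not the exact numbers from the figures.

```python
import numpy as np

# Illustrative joint distribution P(I, D, G):
# axis 0 = I (intelligence, states 0 and 1),
# axis 1 = D (difficulty, states 0 and 1),
# axis 2 = G (grade, states 1, 2, and 3).
joint = np.array([
    [[0.126, 0.168, 0.126], [0.014, 0.070, 0.196]],    # I = 0
    [[0.162, 0.0144, 0.0036], [0.060, 0.036, 0.024]],  # I = 1
])
assert abs(joint.sum() - 1.0) < 1e-9  # a joint distribution sums to 1

# Reduction: keep only the entries consistent with the evidence G = 1
# (drop every row of the table whose G value is not 1).
reduced = joint[:, :, 0]              # shape (2, 2); no longer sums to 1

# Renormalization: rescale so the remaining entries sum to 1 again,
# giving the conditional distribution P(I, D | G = 1).
conditional = reduced / reduced.sum()
print(conditional)
```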
Alternatively, we can sum the rows over all values of a variable, which is marginalization (Figure 3).
Figure 3
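Marginalization works the same way on the table: sum over all values of the variable being removed. A small sketch, reusing the illustrative joint table from the previous sketch:

```python
import numpy as np

# The same illustrative joint table P(I, D, G) as in the previous sketch.
joint = np.array([
    [[0.126, 0.168, 0.126], [0.014, 0.070, 0.196]],    # I = 0
    [[0.162, 0.0144, 0.0036], [0.060, 0.036, 0.024]],  # I = 1
])

# Marginalization: sum out D (axis 1) to obtain P(I, G).
p_ig = joint.sum(axis=1)
print(p_ig)        # shape (2, 3)
print(p_ig.sum())  # still sums to 1
```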
Conditional probability distribution (CPD)
Given the student's intelligence and the exam's difficulty, the distribution over the grade is a conditional probability distribution (Figure 4).
Figure 4
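For example, the CPD P(G | I, D) can be obtained from a joint table by renormalizing over G separately for every combination of I and D. A minimal sketch with the same illustrative numbers as before:

```python
import numpy as np

# The illustrative joint table P(I, D, G) from the earlier sketches.
joint = np.array([
    [[0.126, 0.168, 0.126], [0.014, 0.070, 0.196]],    # I = 0
    [[0.162, 0.0144, 0.0036], [0.060, 0.036, 0.024]],  # I = 1
])

# CPD P(G | I, D): for each (I, D) pair, renormalize the G entries to sum to 1.
cpd_g_given_id = joint / joint.sum(axis=2, keepdims=True)
print(cpd_g_given_id[0, 1])        # the distribution over G when I = 0, D = 1
print(cpd_g_given_id.sum(axis=2))  # each (I, D) row sums to 1
```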
Factors
A factor is a function of a set of random variables: it maps each assignment of those variables to a real number.
Factors are the basic tool for manipulating probability distributions, and they are the basic building blocks for defining distributions over high-dimensional spaces.
Factors can be multiplied (Figure 5), marginalized (Figure 6), and reduced (Figure 7); these operations are sketched in code after the figures below.
Figure 5
Figure 6
Figure 7
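The sketch below runs these three operations on two small made-up factors phi1(A, B) and phi2(B, C); the variable names and values are hypothetical, chosen only to show the mechanics.

```python
import numpy as np

# Two small made-up factors over discrete variables:
# phi1(A, B) indexed as [a, b], phi2(B, C) indexed as [b, c].
phi1 = np.array([[0.5, 0.8],
                 [0.1, 0.3]])
phi2 = np.array([[0.5, 0.7],
                 [0.1, 0.2]])

# Factor product (Figure 5): psi(A, B, C) = phi1(A, B) * phi2(B, C).
psi = phi1[:, :, None] * phi2[None, :, :]  # shape (2, 2, 2), indexed [a, b, c]

# Factor marginalization (Figure 6): sum out B, leaving a factor over (A, C).
tau = psi.sum(axis=1)

# Factor reduction (Figure 7): fix C to its first value, leaving a factor over (A, B).
reduced = psi[:, :, 0]

print(psi.shape, tau.shape, reduced.shape)
```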
The conditional probability distributions of the student model described above can be drawn as a graph.
Each node carries a factor, namely its CPD; for nodes without parents, the CPD is just an unconditional probability distribution.
Figure 8
Chain Rule
The probability distribution is defined as the product of the factors (Figure 9).
Figure 9
For example, for the student model the joint distribution factorizes as P(I, D, G) = P(I) P(D) P(G | I, D).
In general, using the chain rule, a Bayesian network represents the joint probability distribution as the product of the CPDs of all its nodes: P(X1, ..., Xn) = ∏i P(Xi | Pa(Xi)), where Pa(Xi) denotes the parents of Xi in the graph.
An important property of a Bayesian network is that the resulting distribution is a legal probability distribution: it sums to 1.
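Here is a minimal sketch of this factorization for the three-variable student model, using illustrative CPD values (not the exact numbers from the figures), which also checks that the product of the CPDs sums to 1:

```python
import numpy as np

# Illustrative CPDs for the three-variable student model (values are made up).
p_i = np.array([0.7, 0.3])                     # P(I)
p_d = np.array([0.6, 0.4])                     # P(D)
p_g_given_id = np.array([                      # P(G | I, D), indexed [i, d, g]
    [[0.30, 0.40, 0.30], [0.05, 0.25, 0.70]],  # I = 0
    [[0.90, 0.08, 0.02], [0.50, 0.30, 0.20]],  # I = 1
])

# Chain rule for this Bayesian network: P(I, D, G) = P(I) * P(D) * P(G | I, D).
joint = p_i[:, None, None] * p_d[None, :, None] * p_g_given_id

# The product of the CPDs is a legal probability distribution: it sums to 1.
assert abs(joint.sum() - 1.0) < 1e-9
print(joint.sum())
```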
A simple example of a probabilistic graphical model is the blood-type model.
G denotes a genotype and B denotes a blood type. A person's blood type is determined only by their own genotype, while a child's genotype is determined by the genotypes of the two parents (Figure 10).
Figure 10
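As a rough sketch of this structure (the node names are hypothetical, chosen only to record the dependencies just described):

```python
# A rough sketch of the blood-type network structure; node names are hypothetical.
# A child's genotype depends on the genotypes of both parents, and every
# person's blood type depends only on that person's own genotype.
parents = {
    "G_mother": [],
    "G_father": [],
    "G_child":  ["G_mother", "G_father"],
    "B_mother": ["G_mother"],
    "B_father": ["G_father"],
    "B_child":  ["G_child"],
}
```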
You are welcome to discuss and follow this blog, as well as my Weibo and Zhihu pages; the content will continue to be updated.
If you reprint this article, please respect the author's work, keep the text above, and include the link to this article. Thank you for your support!