One: Important principles
(1) Chain rule: P(X1, ..., Xn) = P(X1) P(X2 | X1) ... P(Xn | X1, ..., Xn-1)
(2) Bayes' theorem: P(A | B) = P(B | A) P(A) / P(B)
(3) Conditional independence between variables: X and Y are conditionally independent given Z when P(X, Y | Z) = P(X | Z) P(Y | Z); in a Bayesian network, each node is conditionally independent of its non-descendants given its parents.
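The principles above can be checked numerically on a tiny joint distribution. A minimal sketch for two binary variables A and B (all probability values are made-up illustration numbers), verifying the chain rule and Bayes' theorem:

```python
# Tiny numeric check of the chain rule and Bayes' theorem
# for two binary variables A and B (illustrative numbers only).

# Joint distribution P(A, B)
joint = {
    (0, 0): 0.30, (0, 1): 0.10,
    (1, 0): 0.20, (1, 1): 0.40,
}

def p_a(a):                      # marginal P(A = a)
    return sum(v for (ai, _), v in joint.items() if ai == a)

def p_b(b):                      # marginal P(B = b)
    return sum(v for (_, bi), v in joint.items() if bi == b)

def p_b_given_a(b, a):           # conditional P(B = b | A = a)
    return joint[(a, b)] / p_a(a)

# Chain rule: P(a, b) = P(a) * P(b | a)
for (a, b), p in joint.items():
    assert abs(p - p_a(a) * p_b_given_a(b, a)) < 1e-12

# Bayes' theorem: P(a | b) = P(b | a) * P(a) / P(b)
a, b = 1, 1
p_a_given_b = joint[(a, b)] / p_b(b)
bayes = p_b_given_a(b, a) * p_a(a) / p_b(b)
assert abs(p_a_given_b - bayes) < 1e-12
print(round(p_a_given_b, 3))
```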
Two: Major issues
2.1 Probabilistic inference in Bayesian networks
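As a concrete instance of probabilistic inference, the sketch below answers a query on a hypothetical three-node network Cloudy -> Sprinkler, Cloudy -> Rain by brute-force enumeration, summing out the unobserved variable (all CPT numbers are illustrative):

```python
# Exact inference by enumeration on a hypothetical 3-node network
#   Cloudy -> Sprinkler, Cloudy -> Rain   (CPT values are illustrative).

P_C = {1: 0.5, 0: 0.5}                             # P(Cloudy = c)
P_S = {1: {1: 0.1, 0: 0.9}, 0: {1: 0.5, 0: 0.5}}   # P(Sprinkler = s | Cloudy = c)
P_R = {1: {1: 0.8, 0: 0.2}, 0: {1: 0.2, 0: 0.8}}   # P(Rain = r | Cloudy = c)

def joint(c, s, r):
    """Chain-rule factorization defined by the network structure."""
    return P_C[c] * P_S[c][s] * P_R[c][r]

# Query: P(Rain = 1 | Sprinkler = 1), summing out Cloudy.
num = sum(joint(c, 1, 1) for c in (0, 1))
den = sum(joint(c, 1, r) for c in (0, 1) for r in (0, 1))
posterior = num / den
print(round(posterior, 4))   # -> 0.3
```

Enumeration is exponential in the number of unobserved variables; practical inference algorithms (variable elimination, junction tree, sampling) exploit the network structure to avoid this.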
2.2 Structure Learning: Discovering the graph relationships between variables
Structure Learning algorithms:
(1) K2 algorithm: Learns the Bayesian network structure by finding a parent set for each node. It repeatedly adds candidate parents to a node's parent set, keeping the set of parents that maximizes the joint probability of the data and the structure.
(2) Hill climbing (operators: edge addition, edge deletion, edge reversal): starts from a structure with no edges, and at each step applies the operator that most improves the BIC score. The algorithm stops when no operator raises the structure score any further.
(3) Structure learning with missing data: SEM (Structural EM)
Instead of optimizing the structure and the parameters at the same time, SEM alternates: with the current structure fixed it optimizes the parameters (via EM), then uses those estimates to search for a better structure. Objective: to reduce computational complexity.
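The hill-climbing idea in (2) can be sketched in a stripped-down form: addition-only search (no deletion/reversal operators and no full acyclicity check) over synthetic binary data, scoring each candidate with BIC = max log-likelihood minus a complexity penalty. All data and scores here are for illustration only.

```python
import itertools
import math
import random

random.seed(0)

# Synthetic binary data: B depends on A; C is independent (illustration only).
data = []
for _ in range(500):
    a = random.random() < 0.5
    b = random.random() < (0.9 if a else 0.1)
    c = random.random() < 0.5
    data.append({"A": int(a), "B": int(b), "C": int(c)})
variables = ["A", "B", "C"]

def family_bic(node, parent_set):
    """BIC contribution of one node given its parents:
    max log-likelihood minus (#free params / 2) * log N."""
    n = len(data)
    counts = {}
    for row in data:
        key = tuple(row[p] for p in parent_set)
        cell = counts.setdefault(key, [0, 0])
        cell[row[node]] += 1
    ll = 0.0
    for c0, c1 in counts.values():
        tot = c0 + c1
        for c in (c0, c1):
            if c:
                ll += c * math.log(c / tot)
    n_params = 2 ** len(parent_set)   # one free parameter per parent configuration
    return ll - 0.5 * n_params * math.log(n)

def bic(parent_sets):
    return sum(family_bic(v, parent_sets[v]) for v in variables)

# Greedy hill climbing with edge addition only, starting from the empty graph.
parents = {v: () for v in variables}
while True:
    best_gain, best_edge = 0.0, None
    for u, v in itertools.permutations(variables, 2):
        if u in parents[v] or v in parents[u]:   # skip existing / reversed edges
            continue
        trial = dict(parents)
        trial[v] = parents[v] + (u,)
        gain = bic(trial) - bic(parents)
        if gain > best_gain:
            best_gain, best_edge = gain, (u, v)
    if best_edge is None:                        # no addition improves BIC: stop
        break
    u, v = best_edge
    parents[v] = parents[v] + (u,)

edges = sorted((u, v) for v in variables for u in parents[v])
print(edges)
```

The BIC penalty is what makes the search stop: the strong A-B dependence buys a large likelihood gain, while a spurious edge to the independent variable C typically cannot pay for its extra parameters.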
2.3 Parameter Learning: Determining the quantitative relationships among the interrelated variables
(1) Maximum likelihood estimation is based entirely on the data; no prior probabilities are required.
(2) Bayesian estimation assumes that the network parameters follow a prior distribution before the data are considered. The prior is a subjective probability, and its influence decreases as the amount of data grows.
(3) Maximum likelihood estimation with missing data: the EM algorithm (an iterative algorithm).
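The contrast between (1) and (2) can be shown on a single Bernoulli parameter theta = P(X = 1): the MLE uses only the observed frequency, while the Bayesian posterior mean blends in a prior whose pull vanishes as the sample grows. The Beta(2, 2) prior below is an arbitrary illustrative choice (Beta is the conjugate prior for the Bernoulli likelihood).

```python
# MLE vs. Bayesian estimation of theta = P(X = 1), showing that the
# prior's influence fades as data accumulate. Beta(2, 2) is illustrative.

def mle(ones, n):
    return ones / n

def bayes_posterior_mean(ones, n, alpha=2.0, beta=2.0):
    # Beta prior + Bernoulli likelihood => Beta posterior:
    # posterior mean = (ones + alpha) / (n + alpha + beta)
    return (ones + alpha) / (n + alpha + beta)

# Suppose 70% of observations are 1, at increasing sample sizes.
for n in (10, 100, 10000):
    ones = int(0.7 * n)
    print(n, round(mle(ones, n), 4), round(bayes_posterior_mean(ones, n), 4))
```

With n = 10 the Bayesian estimate is pulled noticeably toward the prior mean 0.5; by n = 10000 both estimates agree to three decimals, matching the note that the prior's influence decreases with data volume.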
2.4 Classification
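The simplest Bayesian-network classifier is naive Bayes, where the class node is the sole parent of every feature, so P(C | f1..fk) is proportional to P(C) * product of P(fi | C). A minimal sketch with made-up CPT values for a toy spam/ham example:

```python
# Naive Bayes classifier: the class C is the only parent of each feature.
# All prior and likelihood numbers are illustrative.

prior = {"spam": 0.4, "ham": 0.6}
# P(word present | class) for two binary word features
likelihood = {
    "spam": {"offer": 0.7, "meeting": 0.1},
    "ham":  {"offer": 0.1, "meeting": 0.6},
}

def classify(features):
    """features: dict word -> 0/1 (present or absent)."""
    scores = {}
    for c in prior:
        score = prior[c]
        for word, present in features.items():
            p = likelihood[c][word]
            score *= p if present else (1.0 - p)
        scores[c] = score
    total = sum(scores.values())
    return {c: s / total for c, s in scores.items()}   # normalized posterior

post = classify({"offer": 1, "meeting": 0})
print(max(post, key=post.get), round(post["spam"], 3))   # -> spam 0.913
```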
2.5 Latent variables and latent structure learning
Research notes for the topic paper: Bayesian networks