First, the wet-grass model: cloudy indicates whether the weather is cloudy, where state 1 (F) means false and state 2 (T) means true; the other nodes use the same encoding. sprinkler indicates whether the sprinkler was turned on, rain indicates whether it rained, and wetgrass indicates whether the grass is wet. The tables beside each node give the corresponding conditional probabilities.
Bayesian network representation: BNT represents a Bayesian network as an adjacency matrix; if there is an arc from node i to node j, entry (i,j) of the matrix is 1, otherwise it is 0. The figure shows the wet-grass causal model. Using MATLAB we can draw the constructed Bayesian network and then query the posterior probability of a single node or the joint posterior of several nodes.
The MATLAB program is as follows. Step one: build the Bayesian network, obtaining the structure above.
clear; clc;
N = 4; % four nodes: cloudy, sprinkler, rain, wetgrass
dag = zeros(N, N);
C = 1; S = 2; R = 3; W = 4;
dag(C, [R S]) = 1; % arcs between the nodes
dag(R, W) = 1;
dag(S, W) = 1;
discrete_nodes = 1:N; % all nodes are discrete
node_sizes = 2 * ones(1, N); % number of states per node (1 = F, 2 = T)
bnet = mk_bnet(dag, node_sizes, 'names', {'cloudy', ...
    'sprinkler', 'rain', 'wetgrass'}, 'discrete', discrete_nodes);
% enter the conditional probability tables by hand
% (BNT lists CPT entries with the first index varying fastest)
bnet.CPD{C} = tabular_CPD(bnet, C, [0.5 0.5]);
bnet.CPD{R} = tabular_CPD(bnet, R, [0.8 0.2 0.2 0.8]);
bnet.CPD{S} = tabular_CPD(bnet, S, [0.5 0.9 0.5 0.1]);
bnet.CPD{W} = tabular_CPD(bnet, W, [1 0.1 0.1 0.01 0 0.9 0.9 0.99]);
% draw the Bayesian network we just built
figure;
draw_graph(dag);
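To make the flat CPT vectors easier to read: BNT lists the entries with the first index (here the parent, cloudy) varying fastest, following MATLAB's column-major convention. A small Python sketch (an illustration only, not BNT code) unpacks the rain CPT from the model above:

```python
# Illustration of BNT's CPT ordering (not BNT itself): entries are flattened
# column-major, so the first-listed node (the parent, cloudy) varies fastest.
flat = [0.8, 0.2, 0.2, 0.8]  # the vector passed to tabular_CPD for rain

# rows: cloudy = F, T ; columns: rain = F, T
cpt_rain = [[flat[0], flat[2]],   # P(rain=F|C=F), P(rain=T|C=F)
            [flat[1], flat[3]]]   # P(rain=F|C=T), P(rain=T|C=T)

print(cpt_rain[0])  # [0.8, 0.2]: rain is unlikely when it is not cloudy
print(cpt_rain[1])  # [0.2, 0.8]: rain is likely when it is cloudy
```

Each row is a full conditional distribution and sums to 1.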
Next, for example, we want to compute the probability that the sprinkler was on given that the grass is wet. The evidence is W = 2 (wet). enter_evidence performs a two-pass message-passing scheme; the first return value is the engine updated with the evidence, and the second is the log-likelihood of the evidence. We compute p = P(S=2 | W=2) as follows:
We use the junction tree engine, the basis of all the exact inference engines in BNT. It is invoked as follows:
engine = jtree_inf_engine(bnet);
evidence = cell(1, N);
evidence{W} = 2;
[engine, loglik] = enter_evidence(engine, evidence);
marg = marginal_nodes(engine, S);
p = marg.T(2); % p = P(S=2 | W=2)
As I currently understand it, evidence holds the conditioning part of the conditional probability you want to compute, and the second argument of marginal_nodes names the variables whose posterior you want. The code above computes the conditional distribution P(S | W=2), which is stored in marg.T.
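As a toolbox-free sanity check (Python rather than MATLAB, using only the CPT values entered above), the same posterior can be obtained by brute-force enumeration of the joint distribution:

```python
from itertools import product

# CPT values from the model above; True/False instead of BNT's 2/1 encoding
p_c = {False: 0.5, True: 0.5}                      # P(cloudy=T) per state
p_s = {False: 0.5, True: 0.1}                      # P(sprinkler=T | cloudy)
p_r = {False: 0.2, True: 0.8}                      # P(rain=T | cloudy)
p_w = {(False, False): 0.0, (True, False): 0.9,    # P(wet=T | sprinkler, rain)
       (False, True): 0.9, (True, True): 0.99}

def joint(c, s, r, w):
    """P(C=c, S=s, R=r, W=w) via the chain rule of the network."""
    ps = p_s[c] if s else 1 - p_s[c]
    pr = p_r[c] if r else 1 - p_r[c]
    pw = p_w[(s, r)] if w else 1 - p_w[(s, r)]
    return p_c[c] * ps * pr * pw

# P(S=T | W=T) = P(S=T, W=T) / P(W=T)
num = sum(joint(c, True, r, True) for c, r in product([False, True], repeat=2))
den = sum(joint(c, s, r, True) for c, s, r in product([False, True], repeat=3))
print(round(num / den, 4))  # ~0.4298, which marg.T(2) should match
```

Seeing the grass wet raises the sprinkler probability from its prior toward 0.43; inference runs "against" the arrows, from effect to cause.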
Two small examples:
First, directly compute the marginal probability P(W):
evidence = cell(1, N);
[engine, ll] = enter_evidence(engine, evidence);
m = marginal_nodes(engine, W);
m.T
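With no evidence entered, this is just the prior marginal. It can be checked without BNT by summing the joint over the other variables (a plain-Python sketch reusing the CPT values from the model):

```python
from itertools import product

P_C = {False: 0.5, True: 0.5}            # P(cloudy)
P_S = {False: 0.5, True: 0.1}            # P(sprinkler=T | cloudy)
P_R = {False: 0.2, True: 0.8}            # P(rain=T | cloudy)
P_W = {(False, False): 0.0, (True, False): 0.9,
       (False, True): 0.9, (True, True): 0.99}

def pt(table, key, on):
    """P(X=on | key) given a table of 'probability that X is True'."""
    p = table[key]
    return p if on else 1 - p

# P(W=T) = sum over c,s,r of P(c) P(s|c) P(r|c) P(W=T|s,r)
p_w_true = sum(P_C[c] * pt(P_S, c, s) * pt(P_R, c, r) * P_W[(s, r)]
               for c, s, r in product([False, True], repeat=3))
print(round(p_w_true, 4))  # ~0.6471; BNT's m.T should be [1-0.6471, 0.6471]
```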
Second, compute the joint probability P(S, R, W):
evidence = cell(1, N);
[engine, ll] = enter_evidence(engine, evidence);
m = marginal_nodes(engine, [S R W]);
m.T
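Here m.T is an 8-entry table indexed by the states of (S, R, W). The same joint can be enumerated directly by summing out cloudy (Python sketch, CPT values taken from the model above):

```python
from itertools import product

P_C = {False: 0.5, True: 0.5}            # P(cloudy)
P_S = {False: 0.5, True: 0.1}            # P(sprinkler=T | cloudy)
P_R = {False: 0.2, True: 0.8}            # P(rain=T | cloudy)
P_W = {(False, False): 0.0, (True, False): 0.9,
       (False, True): 0.9, (True, True): 0.99}

def pt(table, key, on):
    """P(X=on | key) given a table of 'probability that X is True'."""
    p = table[key]
    return p if on else 1 - p

# P(S=s, R=r, W=w) = sum over c of P(c) P(s|c) P(r|c) P(w|s,r)
joint_srw = {}
for s, r, w in product([False, True], repeat=3):
    joint_srw[(s, r, w)] = sum(
        P_C[c] * pt(P_S, c, s) * pt(P_R, c, r) * pt(P_W, (s, r), w)
        for c in [False, True])

assert abs(sum(joint_srw.values()) - 1.0) < 1e-12  # a proper distribution
print(round(joint_srw[(True, True, True)], 4))  # P(S=T, R=T, W=T) ~ 0.0891
```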
This article was written based on another blog post, combined with my own understanding.
------------------Wish you good health and good luck
Huadian North Wind Blows
Key Laboratory of cognitive computing and application, School of Computer Science and technology, Tianjin University
No. 92, Weijin Road, Tianjin
Zip: 300072
Email: [email protected]q.com
FullBNT Learning Notes (1) (MATLAB)