tree_node_w.m
The class structure of the classification tree is very simple: it has only five members.
For a left child node, only right_constrain is set; for a right child node, only left_constrain is set.
In fact, this is a multi-purpose class.
For example, a CART tree trained with a maximum depth of 3 (max_split = 3) has four nodes, each an object of the tree_node_w class; at the same time, each node is a weak classifier.
If the number of training iterations is 100 (max_iter = 100), there will be 100 CART trees, that is, 400 weak classifiers. Each weak classifier has a corresponding weight.
function tree_node = tree_node_w(max_split)
    tree_node.left_constrain = [];
    tree_node.right_constrain = [];
    tree_node.dim = [];
    tree_node.max_split = max_split;
    tree_node.parent = [];
    tree_node = class(tree_node, 'tree_node_w');
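To make the structure concrete, here is a minimal Python sketch of the same five-member node (field names mirror the MATLAB code; the class name TreeNodeW is an invented Python stand-in for tree_node_w):

```python
class TreeNodeW:
    """Sketch of the tree_node_w structure (field names mirror the MATLAB code)."""

    def __init__(self, max_split):
        self.left_constrain = None   # lower bound on the split dimension (set on right children)
        self.right_constrain = None  # upper bound on the split dimension (set on left children)
        self.dim = None              # index of the feature this node splits on
        self.max_split = max_split   # maximum number of splits (depth budget of the tree)
        self.parent = None           # parent node; None for the root
```

As in the MATLAB version, a freshly constructed node has every field empty except max_split; the training routine fills in dim, the constraints, and parent as the tree grows.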
RealAdaBoost.m
Initialize the weight distribution:
Learners = {};
Weights = [];
distr = ones(1, length(Data)) / length(Data);
final_hyp = zeros(1, length(Data));
Train for Max_Iter = 100 rounds:
Each round produces a 4-node CART tree (that is, four weak classifiers).
The first node is the root and has no parent; each of the other three nodes has a parent, and following parents leads back to the root.
Classify the training data with each weak classifier and compute its weight Alpha from the formula.
Adjust the weight distribution distr according to the CART tree's final classification output.
Normalize the weight distribution.
for It = 1 : Max_Iter
    nodes = train(WeakLrn, Data, Labels, distr);
    for i = 1 : length(nodes)
        curr_tr = nodes{i};
        step_out = calc_output(curr_tr, Data);
        % Weighted mass of positive / negative samples captured by this node
        s1 = sum((Labels == 1) .* step_out .* distr);
        s2 = sum((Labels == -1) .* step_out .* distr);
        if (s1 == 0 && s2 == 0)
            continue;
        end
        % Confidence-rated weight for this weak classifier
        Alpha = 0.5 * log((s1 + eps) / (s2 + eps));
        Weights(end+1) = Alpha;
        Learners{end+1} = curr_tr;
        final_hyp = final_hyp + step_out .* Alpha;
    end
    % Re-weight samples by the current strong-classifier output, then normalize
    distr = exp(-1 * (Labels .* final_hyp));
    Z = sum(distr);
    distr = distr / Z;
end
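The bookkeeping inside the inner loop, computing Alpha for one weak classifier and re-weighting the samples, can be sketched in Python as follows. This is a sketch of the update step only: step_out is assumed to be the 0/1 output of one node, and the MATLAB train / calc_output routines are not reproduced; the function name alpha_and_update is invented for illustration.

```python
import math

def alpha_and_update(labels, step_out, distr, final_hyp, eps=1e-12):
    """One Real AdaBoost bookkeeping step for a single weak classifier.

    labels    : +1/-1 ground-truth labels
    step_out  : weak-classifier outputs (nonzero where the node fires)
    distr     : current weight distribution over the samples
    final_hyp : running weighted sum of weak-classifier outputs
    """
    # Weighted mass of positive / negative samples captured by this classifier
    s1 = sum(d * o for y, o, d in zip(labels, step_out, distr) if y == 1)
    s2 = sum(d * o for y, o, d in zip(labels, step_out, distr) if y == -1)
    # Confidence-rated weight for this weak classifier (eps avoids log(0))
    alpha = 0.5 * math.log((s1 + eps) / (s2 + eps))
    # Accumulate into the strong classifier's running output
    final_hyp = [f + alpha * o for f, o in zip(final_hyp, step_out)]
    # Re-weight: samples where Labels .* final_hyp is negative (mistakes)
    # get exponentially more weight, then normalize to a distribution
    distr = [math.exp(-y * f) for y, f in zip(labels, final_hyp)]
    z = sum(distr)
    distr = [d / z for d in distr]
    return alpha, distr, final_hyp
```

With a toy classifier that fires on two positives and one negative, alpha comes out positive (more positive than negative mass), and the returned distribution sums to 1, mirroring the distr update and normalization in the MATLAB loop.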