Reposted from: http://blog.csdn.net/yangliuy/article/details/7322015
Author: Yangliuy
The decision tree algorithm is a very common classification algorithm. It approximates a discrete-valued target function, and the learned function is represented as a decision tree. The basic idea is to repeatedly select the attribute with the highest information gain to partition the sample set, building the tree top-down. Information gain is defined as the difference between the information entropy of a node and that of its child nodes. Information entropy, proposed by Shannon, describes the impurity (instability) of the information; its formula is

Entropy(S) = - Σ_i p_i · log2(p_i)
Here p_i is the proportion of samples of class i in the set (for binary classification, the positive and negative samples). Information gain can then be defined as the expected reduction in entropy when the sample set is partitioned by a given attribute; it measures how well that attribute separates the positive and negative training samples. Its formula is

Gain(S, A) = Entropy(S) - Σ_{v ∈ Values(A)} (|S_v| / |S|) · Entropy(S_v)

where Values(A) is the set of possible values of attribute A, and S_v is the subset of S for which attribute A takes value v.
I implemented the algorithm on the sample set below. The table records whether a game was played under different weather conditions; the program is required to output the decision tree built from this table.
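The table itself was an image in the original post and does not survive this repost. As an illustration only, input in the format the program expects (an attribute-name header row, one instance per line, terminated by "end") might look like the following; the column names and rows here are my assumption based on the classic PlayTennis example, not taken from the post:

```
day  outlook  temperature  humidity  wind    playtennis
1    sunny    hot          high      weak    no
2    sunny    hot          high      strong  no
3    overcast hot          high      weak    yes
...
end
```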
The C++ code is as follows; the program is commented in detail.
```cpp
#include <iostream>
#include <string>
#include <vector>
#include <map>
#include <algorithm>
#include <cmath>
using namespace std;
#define MAXLEN 6 // number of fields in each input row

// Multi-way tree implementation options:
// 1. generalized list
// 2. parent-pointer representation: suits applications that often look up a node's parent
// 3. linked list of children: suits applications that often look up a node's children
// 4. leftmost-child / right-sibling representation: more troublesome to implement
// 5. each node stores all of its children in a vector
// Lesson: the design of the data structure is very important; option 5 fits this
// algorithm best. Also take care to maintain the remaining samples and the remaining
// attributes: loop over the candidate attributes horizontally, and recurse vertically.
vector<vector<string> > state;      // instance set
vector<string> item(MAXLEN);        // one row of the instance set
vector<string> attribute_row;       // the first row, i.e. the attribute names
string end("end");                  // marks the end of the input
string yes("yes");
string no("no");
string blank("");
map<string, vector<string> > map_attribute_values; // all values of each attribute
int tree_size = 0;

struct Node {                 // decision tree node
    string attribute;         // attribute name
    string arrived_value;     // attribute value on the edge leading to this node
    vector<Node *> childs;    // all children
    Node() {
        attribute = blank;
        arrived_value = blank;
    }
};
Node *root;

// Build the attribute -> values map from the data instances
void ComputeMapFrom2DVector() {
    unsigned int i, j, k;
    bool exited = false;
    vector<string> values;
    for (i = 1; i < MAXLEN - 1; i++) { // traverse column by column
```