Previously, I built an SVM-based ordinal regression model (a special kind of multi-class model) and wanted to visualize its classification results, with a different color for each class region. I read a lot of code for this, but most of it only handled the two-class case and ran into problems when extended to multiple classes, so in my thesis I ended up drawing only the decision boundaries. Today, while studying random forests, I found the MATLAB demo below. It does a nice job of filling each class region with its own color, and the result looks very good.
Here is the demo code, demo.m:
```matlab
%% Generate data
prettyspiral = true;

if ~prettyspiral
    % generate some random Gaussian-like data
    rand('state', 0);
    randn('state', 0);
    N = 50;
    D = 2;
    X1 = mgd(N, D, [4 3], [2 -1; -1 2]);
    X2 = mgd(N, D, [1 1], [2 1; 1 1]);
    X3 = mgd(N, D, [3 -3], [1 0; 0 4]);
    X = [X1; X2; X3];
    X = bsxfun(@rdivide, bsxfun(@minus, X, mean(X)), var(X));
    Y = [ones(N, 1); ones(N, 1)*2; ones(N, 1)*3];
    scatter(X(:,1), X(:,2), 20, Y);
else
    % generate twirl (spiral) data!
    N = 50;
    t = linspace(0.5, 2*pi, N);
    x = t.*cos(t);
    y = t.*sin(t);
    t = linspace(0.5, 2*pi, N);
    x2 = t.*cos(t+2);
    y2 = t.*sin(t+2);
    t = linspace(0.5, 2*pi, N);
    x3 = t.*cos(t+4);
    y3 = t.*sin(t+4);
    X = [[x' y']; [x2' y2']; [x3' y3']];
    X = bsxfun(@rdivide, bsxfun(@minus, X, mean(X)), var(X));
    Y = [ones(N, 1); ones(N, 1)*2; ones(N, 1)*3];
    scatter(X(:,1), X(:,2), 20, Y);
end

%% Classify
rand('state', 0);
randn('state', 0);
opts = struct;
opts.depth = 9;
opts.numTrees = 100;
opts.numSplits = 5;
opts.verbose = true;
opts.classifierID = 2; % weak learners to use. Can be an array for a mix of weak learners too

tic;
m = forestTrain(X, Y, opts);
timetrain = toc;
tic;
yhatTrain = forestTest(m, X);
timetest = toc;

% look at the classifier distribution for fun, to see which classifiers
% were chosen at split nodes and how often
fprintf('Classifier distributions:\n');
classifierDist = zeros(1, 4);
unused = 0;
for i = 1:length(m.treeModels)
    for j = 1:length(m.treeModels{i}.weakModels)
        cc = m.treeModels{i}.weakModels{j}.classifierID;
        if cc >= 1 % otherwise no classifier was used at that node
            classifierDist(cc) = classifierDist(cc) + 1;
        else
            unused = unused + 1;
        end
    end
end
fprintf('%d nodes were empty and had no classifier.\n', unused);
for i = 1:4
    fprintf('Classifier with id=%d was used at %d nodes.\n', i, classifierDist(i));
end

%% Plot results
xrange = [-1.5 1.5];
yrange = [-1.5 1.5];
inc = 0.02;

% generate a dense grid over the plot window and classify every grid point
[x, y] = meshgrid(xrange(1):inc:xrange(2), yrange(1):inc:yrange(2));
image_size = size(x);
xy = [x(:) y(:)];
[yhat, ysoft] = forestTest(m, xy);
decmap = reshape(ysoft, [image_size 3]);  % soft (per-class probability) map
decmaphard = reshape(yhat, image_size);   % hard label map

subplot(121);
imagesc(xrange, yrange, decmaphard);
hold on;
set(gca, 'ydir', 'normal');
cmap = [1 0.8 0.8; 0.95 1 0.95; 0.9 0.9 1];
colormap(cmap);
plot(X(Y==1,1), X(Y==1,2), 'o', 'MarkerFaceColor', [.9 .3 .3], 'MarkerEdgeColor', 'k');
plot(X(Y==2,1), X(Y==2,2), 'o', 'MarkerFaceColor', [.3 .9 .3], 'MarkerEdgeColor', 'k');
plot(X(Y==3,1), X(Y==3,2), 'o', 'MarkerFaceColor', [.3 .3 .9], 'MarkerEdgeColor', 'k');
hold off;
title(sprintf('%d trees, Train time: %.2fs, Test time: %.2fs\n', opts.numTrees, timetrain, timetest));

subplot(122);
imagesc(xrange, yrange, decmap);
hold on;
set(gca, 'ydir', 'normal');
plot(X(Y==1,1), X(Y==1,2), 'o', 'MarkerFaceColor', [.9 .3 .3], 'MarkerEdgeColor', 'k');
plot(X(Y==2,1), X(Y==2,2), 'o', 'MarkerFaceColor', [.3 .9 .3], 'MarkerEdgeColor', 'k');
plot(X(Y==3,1), X(Y==3,2), 'o', 'MarkerFaceColor', [.3 .3 .9], 'MarkerEdgeColor', 'k');
hold off;
title(sprintf('Train accuracy: %f\n', mean(yhatTrain==Y)));
```
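The region-coloring trick in the "Plot results" section is not specific to random forests or MATLAB: build a dense grid over the plot window, classify every grid point, reshape the predicted labels back to the grid shape, and draw that array as an image. Here is a minimal Python sketch of the same grid-evaluation idea; the nearest-centroid classifier is my own toy stand-in for the forest (not part of the demo), chosen just so the sketch is self-contained:

```python
import numpy as np

def nearest_centroid_predict(points, centroids):
    """Toy stand-in classifier: label each point by its nearest centroid (labels 1..K)."""
    d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1) + 1  # 1-based labels, as in the MATLAB demo

# Dense grid over the plot window, like meshgrid(xrange(1):inc:xrange(2), ...)
xs = np.arange(-1.5, 1.5 + 1e-9, 0.02)
ys = np.arange(-1.5, 1.5 + 1e-9, 0.02)
gx, gy = np.meshgrid(xs, ys)
grid = np.column_stack([gx.ravel(), gy.ravel()])  # the xy = [x(:) y(:)] step

# Classify every grid point (here with the toy classifier, three classes)
centroids = np.array([[1.0, 0.0], [-0.5, 1.0], [-0.5, -1.0]])
labels = nearest_centroid_predict(grid, centroids)

# Reshape back to the image grid; imshow/imagesc of this array with a
# 3-color colormap produces the filled decision regions
decision_map = labels.reshape(gx.shape)
print(decision_map.shape)            # (151, 151): one label per grid cell
print(sorted(set(labels.tolist())))  # [1, 2, 3]: all three regions appear
```

In MATLAB the final drawing step is `imagesc` with a 3-row colormap; in matplotlib the equivalent would be `plt.imshow(decision_map, extent=[-1.5, 1.5, -1.5, 1.5], origin='lower')` with a `ListedColormap` of three pale colors, then scattering the training points on top.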
The full code is available at: https://github.com/karpathy/Random-Forest-Matlab