This section briefly describes how LIBSVM is used. I first used LIBSVM some time ago, mainly for simple face-recognition experiments, and at that time I also translated the English documentation for LIBSVM's MATLAB interface.
At the time of writing, the latest version of LIBSVM is 3.20, as shown below:
The downloaded LIBSVM package actually contains toolboxes for quite a few platforms: C++, MATLAB, Java, and Python are all there. Their functions are used in the same way on each platform.
After downloading, open the matlab subfolder and simply run make.m. Normally, if your MATLAB already has a compiler configured, it runs directly; if not, you also need to run mex -setup and choose a compiler. A small reminder: do not do this compilation under the C: drive (that is, do not put LIBSVM on the C: drive), because permission restrictions may stop the machine from compiling. After compiling, add the compiled folder and its contents to MATLAB's search path (Set Path), and then you can use it. A normal compilation goes like this:
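The steps above can be sketched as the following commands, typed at the MATLAB prompt. The folder path is only an illustration; adjust it to wherever you unpacked LIBSVM (anywhere but the C: drive):

```matlab
% mex -setup            % only needed if no compiler is configured yet
cd('D:\toolbox\libsvm\matlab');   % hypothetical path, not on the C: drive
make;                             % builds the mex files: libsvmread, libsvmwrite, svmtrain, svmpredict
addpath(genpath('D:\toolbox\libsvm\matlab'));
savepath;                         % keep the path change across sessions
```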
In the face-recognition experiment mentioned above we already introduced the main functions; here we gather them together in one place:
The current version of LIBSVM, once compiled under MATLAB, provides only four functions: libsvmread, libsvmwrite, svmtrain (which shares its name with a function in MATLAB's own toolbox), and svmpredict.
(1) libsvmread is mainly used for reading data.
The data here means non-MATLAB (.mat) data, such as .txt or .data files. In that case you need the libsvmread function to convert it into data MATLAB can recognize. Take the heart_scale data as an example: there are two ways to import it into MATLAB. One is the libsvmread function, typed directly in MATLAB as libsvmread('heart_scale'); the other is to click MATLAB's "Import Data" button, navigate to where heart_scale is, and select it directly. Personally I find the second way works very well with any data. For example, suppose you download data from some database; how do you turn it into MATLAB data? Some data libsvmread cannot read, yet "Import Data" can still turn it into MATLAB data.
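A minimal sketch of the first way, assuming the heart_scale file that ships with the LIBSVM distribution is in the current folder:

```matlab
% libsvmread returns the label vector and a sparse instance matrix.
[label_vector, instance_matrix] = libsvmread('heart_scale');
whos label_vector instance_matrix   % inspect sizes and sparsity
```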
(2) libsvmwrite is a writing function, used to save known data.
It is used as follows: libsvmwrite('filename', label_vector, instance_matrix);
label_vector is the label vector and instance_matrix is the data matrix. Note that this matrix must be a sparse matrix, i.e. converted with sparse() so that the many useless zero entries are not stored; remove them in this way before saving.
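A small sketch of saving toy data in LIBSVM's format (the labels, features, and filename here are made up for illustration):

```matlab
labels   = [1; -1; 1];                    % one label per sample
features = [0.5 0 1.2; 0 0.3 0; 1.1 0 0.7];
% the instance matrix must be sparse, hence the sparse() call
libsvmwrite('toy_data.txt', labels, sparse(features));
```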
(3) svmtrain is the training function; it generates a model from the training data. It is generally used directly as follows:

model = svmtrain(label, data, cmd);

label is the label vector; data is the training data (one point to be careful about: each row holds all the data of one sample, so the number of rows equals the number of samples); and each sample must correspond to one label (a classification problem is generally a two-class problem, i.e. one label per sample). cmd is the corresponding option string. What options are there? Many: -v, -t, -g, -c, and so on, and different options mean different things. For classification, -t selects the kernel type:

-t 0: linear kernel
-t 1: polynomial kernel
-t 2: radial basis function (Gaussian) kernel
-t 3: sigmoid kernel
-t 4: precomputed kernel (added in newer versions; not covered here)

-g is the kernel parameter coefficient, -c is the penalty (cost) coefficient, and -v is the number of cross-validation folds (e.g. -v 5). Whether -v is written in the svmtrain call makes a difference in what comes out of the model: without it, model is a structure, a real model that can be passed directly into svmpredict; with it, what comes out is the cross-validation accuracy of the training, a single numeric value. In general these are the important parameters; there are many others, for which you can refer to more complete references online. The face-recognition method below uses only these, so the others are not written out.
(4) svmpredict is the prediction function; it uses the trained model to predict the labels of data. It can be called in two ways:
[predicted_label, accuracy, decision_values/prob_estimates] = svmpredict(testing_label_vector, testing_instance_matrix, model, 'libsvm_options');
[predicted_label] = svmpredict(testing_label_vector, testing_instance_matrix, model, 'libsvm_options');
In the first way, there are three outputs: the predicted labels, the accuracy, and the decision values (or probability estimates, used for non-classification problems). The first input is the test label vector. This may or may not be the true labels: if it is not, the reported accuracy is meaningless; if it is, you can compare it with the predicted labels to check the accuracy. The point to stress is that whether or not this value is meaningful, it must still be supplied; even if you have no true labels you still have to pass some label values (it does not matter whether they are right), because the function signature requires it. Then comes the test data, and finally the option string (here essentially only the -b option is available, plus -q for quiet mode in newer versions).

I once ran into a question like this: if I set the -g parameter to 0.1 in the training function, do I have to specify this parameter again when predicting? If you do specify it, the program errors, reminding you that svmpredict has no -g option. That is because svmtrain produces a model, and in svmpredict you use that model, which already contains all of your training parameters; so svmpredict does not have those options.

As for the outputs, the two calling styles differ: the first call returns all the available outputs, while the second returns only predicted_label, the predicted labels. You can see that for a simple classification prediction, the second way is actually better: both simple and practical.
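The two calling styles above can be sketched as follows, assuming model, test_data, and test_label exist as in the earlier examples:

```matlab
% first style: all three outputs, with true labels so accuracy is meaningful
[pred, acc, dec] = svmpredict(test_label, test_data, model);

% second style: only the predicted labels; a label argument must still be
% passed even if you do not care about it (here the true labels again)
pred = svmpredict(test_label, test_data, model);
```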
That concludes the introduction of the four functions for classification problems. Of course there are many details that can be optimized and are not detailed here, such as which parameters to use: if you do not specify any, all the parameters take their default values, and the defaults may not be the best, which raises the question of how to tune these parameters.
So much for the usage introduction. Below, the sample set is the set of 200 nonlinear samples used previously, and the script is as follows:
```matlab
%% LIBSVM toolbox, simple use
% final data format: m*n, m = sample count, n = dimension
% label: m*1, the two classes -1 and 1
clc
clear
close all

data = load('data_test1.mat');
data = data.data';
% number of training samples (the value was garbled in the source; 80 is a guess)
num_train = 80;
% construct a random selection sequence
choose = randperm(length(data));
train_data = data(choose(1:num_train), :);
gscatter(train_data(:,1), train_data(:,2), train_data(:,3));
label_train = train_data(:, end);
test_data = data(choose(num_train+1:end), :);
label_test = test_data(:, end);
predict = zeros(length(test_data), 1);

%% train the model and predict the classes
model = svmtrain(label_train, train_data(:, 1:end-1), '-t 2'); % -t 2: RBF kernel
for i = 1:length(test_data)
    % the first argument of svmpredict is only a placeholder here
    predict(i) = svmpredict(1, test_data(i, 1:end-1), model);
end

%% display the results
figure;
index1 = find(predict == 1);
data1 = (test_data(index1, :))';
plot(data1(1,:), data1(2,:), 'or'); hold on
index2 = find(predict == -1);
data2 = (test_data(index2, :))';
plot(data2(1,:), data2(2,:), '*b'); hold on
indexw = find(predict ~= label_test);
dataw = (test_data(indexw, :))';
plot(dataw(1,:), dataw(2,:), '+g', 'LineWidth', 3);
accuracy = length(find(predict == label_test)) / length(test_data);
title(['predict the testing data and the accuracy is: ', num2str(accuracy)]);
```
As you can see, the part about SVM is only a small piece; the rest is auxiliary. One run gives a result like this:
The data was deliberately given some overlap, so this is a very good result. Of course, the LIBSVM functions have many more details, such as parameter selection and so on; different parameters give different results, and that is left for you to explore.
This is where the SVM series of articles ends. Thanks for reading, friends ~_~
Copyright notice: this is an original article by the blogger and may not be reproduced without the blogger's permission.
Demystifying the SVM series (V): Simple use of LIBSVM under MATLAB