This article combs through and summarizes SVM at its different stages of development. Whether you look at the basic version of SVM or its upgraded versions, you will find that the same two core ideas run through all of them. After reading this article, I believe you will have a comprehensive understanding of this important class of classifiers, or even form insights of your own. Well, let's get started, and good luck.
1. Core Ideas
For any nonlinear method, a corresponding linear method can always be obtained by a suitable transformation of the features, but this transformation sometimes brings two problems:
1) The dimension of the feature space after the transformation is necessarily larger, and in most cases it grows exponentially with the original feature dimension and the degree of nonlinearity, so the algorithm cannot proceed;
2) Although the feature-space dimension increases, the number of samples does not; the number of parameters to be estimated in the decision function still grows with the dimension, and the resulting higher complexity leads to poor generalization ability.
For SVM, however, these problems are no obstacle at all, because it has two powerful weapons that resolve them neatly. SVM has two core ideas: 1) Large margin: good generalization ability is ensured by maximizing the classification margin, which solves the second problem above; 2) Kernel function: the nonlinear transformation of the features is realized indirectly through the inner product defined by the kernel function, so the nonlinear problem in the original space is solved as a linear problem in the new space while high-dimensional computation is avoided, which solves the first problem. These two ideas are very, very important; they offer a great deal of inspiration in later study (you will see this in the follow-up articles), and I hope you come to appreciate them gradually.
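To make the kernel idea concrete, here is a minimal numerical sketch of my own (in Python with NumPy; it is an illustration, not part of the original article). For 2-D inputs, the degree-2 polynomial kernel K(x, y) = (x·y)² computed entirely in the original space gives exactly the ordinary inner product after the explicit feature map φ(x) = (x₁², x₂², √2·x₁x₂) into 3-D space, so the high-dimensional mapping never has to be carried out:

```python
import numpy as np

def phi(x):
    # Explicit degree-2 feature map for a 2-D input:
    # phi(x) = (x1^2, x2^2, sqrt(2) * x1 * x2)
    return np.array([x[0] ** 2, x[1] ** 2, np.sqrt(2) * x[0] * x[1]])

def poly_kernel(x, y):
    # Homogeneous polynomial kernel of degree 2,
    # evaluated in the original 2-D space
    return np.dot(x, y) ** 2

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])

lhs = poly_kernel(x, y)        # kernel value in the original space
rhs = np.dot(phi(x), phi(y))   # inner product in the 3-D feature space
print(lhs, rhs)                # both equal (1*3 + 2*0.5)^2 = 16.0
```

The same principle lets SVM work with feature spaces of very high (even infinite) dimension, such as the one induced by the RBF kernel, at the cost of only an inner product in the original space.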
2. Application
By now, do you have a comprehensive understanding of SVM? If so, congratulations: you can start building your own applications with it.
In practical applications, you will find that SVM has a great advantage in classification performance over traditional pattern-recognition methods and ANN (artificial neural network) methods, and that SVM is not very sensitive to the choice of kernel function: the three kernel functions mentioned earlier basically produce results with little difference.
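As a quick sanity check of this insensitivity claim (my own sketch using scikit-learn, assumed to be installed; a toy experiment, not a rigorous benchmark), one can train an SVM with different kernels on an easy, well-separated two-class problem and compare test accuracies:

```python
# Compare common SVM kernels on a toy, well-separated 2-class problem.
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_blobs(n_samples=200, centers=2, cluster_std=1.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

scores = {}
for kernel in ("linear", "poly", "rbf"):
    clf = SVC(kernel=kernel).fit(X_tr, y_tr)
    scores[kernel] = clf.score(X_te, y_te)
print(scores)  # all three kernels score similarly on this easy data
```

On harder, genuinely nonlinear data the kernel choice (and its hyperparameters) matters more, so this should be read only as an illustration of the common case, not a universal rule.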
The Support Vector Machine was originally published in 1992 and 1995 and did not attract much attention at the time. It was not until the late 1990s, with the rise of machine learning and pattern recognition, that a large wave of SVM research was triggered. Its earliest application was a handwritten digit recognition project conducted by its founder, Vapnik, at AT&T Labs, using the USPS handwritten digit database; the best result in actual testing was an error rate of 4%. Typical applications include OCR, face recognition, text recognition, DNA sequence analysis, and so on.
3. Open-source SVM Library
Some of the more influential libraries are:
libsvm: http://www.csie.ntu.edu.tw/~cjlin/libsvm/
SVMlight: http://svmlight.joachims.org/
SVMTorch: http://bengio.abracadoudou.com/svmtorch.html
Of course, there are other libraries as well, which can be found at http://www.kernel-machines.org/software.