This section covers kernel SVMs, following Andrew Ng's handout, which explains the topic well.
The first topic is the kernel trick, which lets us compute inner products in a high-dimensional feature space directly from the low-dimensional inputs, without ever constructing the high-dimensional features explicitly. The handout also discusses how to determine whether a function K is a valid kernel, that is, when the kernel trick can be applied.
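As a small sketch of the trick (the degree-2 polynomial kernel is my choice of example, not taken from the handout text itself): for 2-dimensional inputs, K(x, z) = (x·z)^2 equals the inner product of the explicit feature vectors of all pairwise products x_i x_j, but the kernel side never builds those features.

```python
import numpy as np

def phi(x):
    # explicit feature map for K(x, z) = (x . z)^2 in 2 dimensions:
    # all pairwise products x_i * x_j (4 features)
    return np.array([x[0] * x[0], x[0] * x[1], x[1] * x[0], x[1] * x[1]])

x = np.array([1.0, 2.0])
z = np.array([3.0, 4.0])

lhs = np.dot(phi(x), phi(z))  # inner product in the explicit feature space
rhs = np.dot(x, z) ** 2       # kernel computed directly on the inputs
print(lhs, rhs)               # both are 121.0
```

The right-hand side costs O(n) in the input dimension, while the explicit map costs O(n^2) features; this gap is exactly what the kernel trick exploits.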
In addition, a kernel function can be read as a similarity measure between two inputs: the larger the value, the more similar they are.
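To illustrate this similarity reading (a minimal sketch, using the Gaussian kernel as the example): nearby points get a kernel value close to 1, distant points a value close to 0.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    # K(x, z) = exp(-||x - z||^2 / (2 * sigma^2))
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

x = np.array([0.0, 0.0])
near = np.array([0.1, 0.1])
far = np.array([3.0, 3.0])

print(gaussian_kernel(x, near))  # close to 1: very similar
print(gaussian_kernel(x, far))   # close to 0: dissimilar
```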
Next is the polynomial kernel; note that its coefficient and constant term also matter, since they affect the final margin.
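A quick sketch of these knobs, assuming scikit-learn's SVC (the toy circular data set is my own construction): scikit-learn's polynomial kernel is (gamma * &lt;x, x'&gt; + coef0) ** degree, so gamma plays the role of the coefficient and coef0 the constant.

```python
import numpy as np
from sklearn.svm import SVC

# toy data with a circular decision boundary
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 > 0.5).astype(int)

# gamma (coefficient) and coef0 (constant) both reshape the implicit
# feature space and hence the resulting margin
clf = SVC(kernel="poly", degree=2, gamma=1.0, coef0=1.0)
clf.fit(X, y)
print(clf.score(X, y))
```

A degree-2 kernel with a nonzero coef0 includes the squared features, so it can fit this circular boundary well.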
Then the Gaussian kernel, which implicitly maps the original data into an infinite-dimensional feature space! However, if its parameter is chosen poorly, overfitting can occur.
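The overfitting risk can be sketched as follows (again assuming scikit-learn's SVC; the noisy toy data is my own): a very large gamma makes each Gaussian bump extremely narrow, so the model memorizes the training set, including its label noise, and generalizes poorly.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(300, 2))
# labels with 10% noise, so fitting the training set perfectly
# means memorizing noise
y = (X[:, 0] * X[:, 1] > 0).astype(int)
flip = rng.random(300) < 0.1
y[flip] = 1 - y[flip]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# huge gamma: near-perfect training accuracy, poor test accuracy
clf = SVC(kernel="rbf", gamma=1000.0).fit(X_tr, y_tr)
print(clf.score(X_tr, y_tr), clf.score(X_te, y_te))
```

With such a narrow kernel the decision function is essentially a lookup table of the training points, which is exactly what "bad parameter choice leads to overfitting" means here.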
Finally, several kernels were compared, each with its own advantages and disadvantages. Note that the linear kernel should be the first one to try.
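The "try linear first" advice can be illustrated with a sketch (scikit-learn's SVC, on toy data that happens to be roughly linearly separable): when the linear kernel already fits well, prefer it, since it is faster to train and easier to interpret than a nonlinear kernel.

```python
import numpy as np
from sklearn.svm import SVC

# roughly linearly separable toy data
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

linear = SVC(kernel="linear").fit(X, y)
rbf = SVC(kernel="rbf").fit(X, y)
print(linear.score(X, y), rbf.score(X, y))
# the linear kernel already does well here, so there is little
# reason to pay the cost of a nonlinear one
```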
References:
http://www.cnblogs.com/xbf9xbf/p/4621769.html
http://www.cnblogs.com/bourneli/p/4202423.html
Coursera Machine Learning Techniques, course note 03: Kernel Support Vector Machine