Here are some of the most frequently used kernel functions:
Linear kernel
Polynomial kernel
Radial basis function (RBF) kernel

Also called the Gaussian kernel, because it is a radial basis function of the distance between its two arguments.
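The three kernels above can be sketched in a few lines of NumPy. This is a minimal illustration; the hyperparameter names (degree, gamma, coef0, sigma) follow common convention and are assumptions, not fixed by the text.

```python
import numpy as np

def linear_kernel(x, y):
    """K(x, y) = <x, y>"""
    return float(np.dot(x, y))

def polynomial_kernel(x, y, degree=3, gamma=1.0, coef0=1.0):
    """K(x, y) = (gamma * <x, y> + coef0) ** degree"""
    return float((gamma * np.dot(x, y) + coef0) ** degree)

def gaussian_kernel(x, y, sigma=1.0):
    """K(x, y) = exp(-||x - y||^2 / (2 * sigma^2))"""
    sq_dist = float(np.dot(x - y, x - y))
    return float(np.exp(-sq_dist / (2.0 * sigma ** 2)))
```

Note that the Gaussian kernel of a point with itself is always 1, and it decays toward 0 as the two points move apart.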
A radial basis function is a real-valued function whose value depends only on the distance from a fixed point, that is, φ(x) = φ(‖x − c‖). A function satisfying this property is called a radial function. The distance is usually Euclidean, although other distance functions are possible. Consequently, two other commonly used kernels, the exponential kernel and the Laplacian kernel, are also radial basis kernels.
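Both of these kernels depend on x and y only through a distance, which is what makes them radial basis kernels. The sketch below uses one common parameterization with a bandwidth sigma; exact conventions (and even the choice of norm for the Laplacian kernel) vary between references and libraries, so treat these forms as assumptions.

```python
import numpy as np

def exponential_kernel(x, y, sigma=1.0):
    """K(x, y) = exp(-||x - y|| / (2 * sigma^2)), Euclidean distance"""
    return float(np.exp(-np.linalg.norm(x - y) / (2.0 * sigma ** 2)))

def laplacian_kernel(x, y, sigma=1.0):
    """K(x, y) = exp(-||x - y|| / sigma), Euclidean distance"""
    return float(np.exp(-np.linalg.norm(x - y) / sigma))
```

Unlike the Gaussian kernel, both use the distance itself rather than its square, so they are less smooth at x = y but decay more slowly in the tails.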
Less frequently used radial basis kernels include the ANOVA kernel, the rational quadratic kernel, the multiquadric kernel, and the inverse multiquadric kernel.
Exponential kernel

Laplacian kernel

ANOVA kernel

Rational quadratic kernel

Multiquadric kernel

Inverse multiquadric kernel
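The less common radial kernels in the list above can be sketched as follows. These are widely cited textbook forms, not definitions from this post; the parameter names (sigma, d, c) are assumptions.

```python
import numpy as np

def anova_kernel(x, y, sigma=1.0, d=2):
    """K(x, y) = sum_k exp(-sigma * (x_k - y_k)^2) ** d"""
    return float(np.sum(np.exp(-sigma * (x - y) ** 2) ** d))

def rational_quadratic_kernel(x, y, c=1.0):
    """K(x, y) = 1 - ||x - y||^2 / (||x - y||^2 + c)"""
    d2 = float(np.dot(x - y, x - y))
    return 1.0 - d2 / (d2 + c)

def multiquadric_kernel(x, y, c=1.0):
    """K(x, y) = sqrt(||x - y||^2 + c^2)"""
    return float(np.sqrt(np.dot(x - y, x - y) + c ** 2))

def inverse_multiquadric_kernel(x, y, c=1.0):
    """K(x, y) = 1 / sqrt(||x - y||^2 + c^2)"""
    return 1.0 / multiquadric_kernel(x, y, c)
```

The rational quadratic kernel is sometimes used as a cheaper stand-in for the Gaussian kernel, since it avoids the exponential.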
Another simple and useful kernel is the sigmoid kernel.
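A minimal sketch of the sigmoid (hyperbolic tangent) kernel; the slope and offset names alpha and c are assumed conventions.

```python
import numpy as np

def sigmoid_kernel(x, y, alpha=1.0, c=0.0):
    """K(x, y) = tanh(alpha * <x, y> + c)"""
    return float(np.tanh(alpha * np.dot(x, y) + c))
```

Its output is bounded in (-1, 1); note that, unlike the kernels above, it is not positive semi-definite for every choice of alpha and c.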
Some of the kernels above are used more often than others. In SVM tools such as SVM-light and RankSVM, most of them can be selected directly through parameters. Less common kernels, such as the wavelet kernel and the Bayesian kernel, can be implemented in your own code.
(When reprinting, please credit the author and source: http://blog.csdn.net/xiaowei_cqu. Do not use for commercial purposes without consent.)
"Pattern Recognition": SVM kernel functions