SVR (Support Vector Regression) Algorithm in Detail

Source: Internet
Author: User
Tags: svm

Preface
This article does not go from SVM to SVR, because once you understand how SVR works, you will also understand the reasoning behind SVM.
This is a summary of some of my work; all discussion has been desensitized and is unrelated to the projects, technology, and patents involved in that work.
Comments and corrections are welcome.

The following sets out the evolution of the algorithm step by step. Since a fair amount of mathematics is involved, there will be many formula derivations; the explanation below approaches them as much as possible from the angle of application, so that the theory is easier to understand. All reference books and papers will be listed at the end of the text. If there is any mistake or infringement, please let me know.
Directory
Some background on SVM
1. SVR-related theory
  1.1 Kernel mapping
    1.1.1 What is a kernel mapping?
    1.1.2 How are kernel mappings introduced into SVR?
    1.1.3 What is the output of a kernel function?
    1.1.4 Extension: how to construct a useful kernel (the conditions a useful kernel should satisfy)
  1.2 SMO (Sequential Minimal Optimization) algorithm
  1.3 SVR as a whole
  1.4 SVR input and output
2. SVR application and implementation

Some background on SVM


The figure above is about SVM; three predecessors have to be mentioned:
On the left is Vapnik (former Soviet Union, now Russia), who moved to Bell Labs a year before the dissolution of the USSR. During his time there he proposed the famous SVM algorithm, which was mainly used for handwriting recognition at the time. Dr. Vapnik has since been recruited by Facebook AI to launch another round of AI research.
In the middle of the diagram is John C. Platt. If Vapnik was the pioneer, then Platt was undoubtedly an important catalyst. After Vapnik proposed SVM, which involves the kernel trick, the industry's common practice was to solve it as a QP (Quadratic Programming) problem, whose computational speed and resource usage limited its application to large samples. Although some existing methods (Chunking, Osuna's method, etc.) were designed to overcome these problems, they still fell short of the desired effect. In 1998, Platt first proposed the SMO (Sequential Minimal Optimization) algorithm, which largely overcame SVM's computational problem on large samples; for this reason, Platt is also a predecessor very worthy of his place in history.
On the right is Professor Chih-Jen Lin. If Platt took Vapnik's SVM from academia to industry, then Chih-Jen Lin took it from complex engineering to ordinary users: the LIBSVM package he developed is completely oriented toward everyday use and has implementations in multiple languages (Java, MATLAB, R, CUDA, Python, etc.).
As a descendant, I find there are many SVM forebears worth remembering; here, through the three predecessors above, I would like to pay high tribute to everyone who contributed to the development of SVM.

1. SVR-related theory

1.1 Kernel mapping

The kernel mapping is a very ingenious idea in SVM/SVR. Without introducing the kernel idea, SVM/SVR would be just another kind of perceptron learning algorithm (regarding the PLA, interested readers can look it up; there may be time to discuss it later).

In SVR, there are a few things to know about kernel mappings:

1.1.1 What is a kernel mapping?
To be exact, the kernel function is a technique used across much of machine learning (in most papers it is called the kernel trick), so it should not be over-mystified. This technique lets us compute inner products in the mapped feature space directly, without explicitly specifying a concrete form of the kernel mapping.
So, what is the kernel mapping (mapping function)? There are usually three kinds of mappings: (1) a transformation that neither adds nor removes dimensions, (2) one that increases or decreases the dimension, and (3) a combination of the two. For example, suppose the mapping function is \psi(x) = ((x^{(1)})^2, (x^{(2)})^2, \cdots).
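The kernel trick above can be made concrete with a small sketch (my own illustration, not from the original text): for the degree-2 homogeneous polynomial kernel K(x, z) = (x·z)^2 on 2-D inputs, the implicit mapping is \psi(x) = (x_1^2, \sqrt{2}\,x_1 x_2, x_2^2), and the kernel function computes the same inner product without ever forming \psi(x):

```python
import numpy as np

def psi(x):
    """Explicit feature map for the degree-2 homogeneous polynomial kernel."""
    x1, x2 = x
    return np.array([x1**2, np.sqrt(2) * x1 * x2, x2**2])

def poly_kernel(x, z):
    """Kernel function: the same inner product, computed implicitly."""
    return np.dot(x, z) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, 4.0])

explicit = np.dot(psi(x), psi(z))   # inner product in the mapped feature space
implicit = poly_kernel(x, z)        # same value, without computing psi at all

print(explicit, implicit)           # both are 121.0
```

This is why the kernel is "just a trick": the learning algorithm only ever needs inner products of mapped points, so it can work in a high-dimensional (even infinite-dimensional) feature space at the cost of evaluating K.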

1.1.2 How are kernel mappings introduced into SVR?

1.1.3 What is the output of a kernel function?

1.1.4 Extension: how to construct a useful kernel (the conditions a useful kernel should satisfy)

1.2 SMO (Sequential Minimal Optimization) algorithm

1.3 SVR as a whole

1.4 SVR input and output (how to deploy)

2. SVR application (regression and multi-class classification)

To be completed ...
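As a preview of this section, here is a minimal sketch of SVR in practice. I use scikit-learn's SVR class (my own choice, not from the original text; it wraps the LIBSVM library mentioned above) to fit a noisy sine curve with an RBF kernel:

```python
import numpy as np
from sklearn.svm import SVR

# Toy regression problem: learn y = sin(x) from noisy samples.
rng = np.random.RandomState(0)
X = np.sort(5 * rng.rand(80, 1), axis=0)      # 80 inputs in [0, 5)
y = np.sin(X).ravel() + 0.1 * rng.randn(80)   # noisy targets

# epsilon sets the width of the tube within which errors are ignored;
# C trades off model flatness against deviations larger than epsilon.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1)
model.fit(X, y)

pred = model.predict([[1.5]])[0]
print(pred)   # should land near sin(1.5)
```

The hyperparameter choices (C=10.0, epsilon=0.1) are illustrative defaults for this toy problem, not a general recommendation.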
