Probabilistic SVM and Kernel Logistic Regression (KLR)

This article is about the relationship between SVM and logistic regression.

(i) Introduction to the SVM algorithm

First, let us review the SVM algorithm (in general, "SVM" here refers to the soft-margin SVM).

What is the optimization objective of this algorithm? We know that this objective must be related to how the error is measured.

So how does SVM measure error? In other words, what does the slack variable ξ_n represent in SVM?
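
The formula this refers to did not survive extraction; the standard unconstrained form of the soft-margin SVM objective (a reconstruction, with z_n = Φ(x_n)) is:

$$
\min_{b,\mathbf{w}} \; \frac{1}{2}\mathbf{w}^{T}\mathbf{w} + C \sum_{n=1}^{N} \xi_n ,
\qquad
\xi_n = \max\bigl(1 - y_n(\mathbf{w}^{T}\mathbf{z}_n + b),\, 0\bigr).
$$

So ξ_n is exactly the amount by which example n violates the margin.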

The goal of SVM is to minimize the expression above; the term Σ_n ξ_n is what measures the error. Does this look familiar? In regularization, the quantity we minimize has exactly the same form. But the two start from different ideas: in regularization, our goal is to minimize the error, while also limiting the length of w (that is, ||w||);

For SVM, our goal is to minimize ||w||, while also limiting the error.

Which of the two terms carries more weight can be tuned: in regularization it is adjusted with λ, and in SVM it is adjusted with C.

Overall, the two amount to the same thing, but the SVM formulation has two advantages: first, even with the non-linear (hinge) error measure above, we can still solve the problem with a QP solver; second, we can use the kernel trick.
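
As a concrete illustration (a minimal sketch; the dataset and parameter values are assumptions, not from the original), scikit-learn's SVC solves this soft-margin problem in its dual QP form, with C as the trade-off knob and the kernel trick available through the kernel argument:

```python
# Soft-margin SVM sketch with scikit-learn: C controls the error/||w|| trade-off,
# and the RBF kernel replaces an explicit non-linear feature transform.
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)  # toy data (assumption)

clf = SVC(kernel="rbf", C=1.0, gamma=1.0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```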

In particular, let us compare this error measure with the 0/1 error:
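
Writing the score as s = w^T z + b, the two error measures being compared (reconstructed here, since the original figure did not survive) are:

$$
\text{err}_{0/1}(s, y) = [\![\, ys \le 0 \,]\!],
\qquad
\widehat{\text{err}}_{\text{svm}}(s, y) = \max(1 - ys,\, 0),
$$

and max(1 - ys, 0) ≥ [[ys ≤ 0]] for every ys, which is the upper-bound relationship discussed next.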

We find that this error measure is also an upper bound on the 0/1 error. Where have we seen a similar situation before? With the squared error and the cross-entropy error.

We can see that the SVM error measure is numerically close to the (scaled) cross-entropy error. So we say SVM ≈ L2-regularized logistic regression.
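
To make the comparison concrete, here is a small numeric sketch (the grid of margin values is an assumption, not from the original): the scaled cross-entropy error log2(1 + exp(-ys)) of logistic regression stays close to the hinge error, and both upper-bound the 0/1 error.

```python
# Compare the 0/1, hinge (SVM), and scaled cross-entropy (logistic regression)
# errors on a grid of signed margins ys = y * (w^T z + b).
import numpy as np

ys = np.linspace(-3, 3, 13)
err_01  = (ys <= 0).astype(float)     # 0/1 error
err_svm = np.maximum(1 - ys, 0)       # hinge error of soft-margin SVM
err_sce = np.log2(1 + np.exp(-ys))    # scaled cross-entropy error

for m, a, b, c in zip(ys, err_01, err_svm, err_sce):
    print(f"ys={m:+.1f}  0/1={a:.0f}  hinge={b:.2f}  scaled-CE={c:.2f}")
```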

(ii) Probabilistic SVM

How can we fuse SVM and logistic regression?

One might ask why SVM should be linked to logistic regression at all. What advantage does logistic regression have over SVM? Its maximum-likelihood probability estimates? Wouldn't it be fine to just use SVM directly?

Neither of the naive ways of combining the two is good enough to absorb the benefits of both methods.
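
The fusion that does absorb both benefits is the Platt-style probabilistic SVM, which is presumably what this section's heading refers to: run the kernel SVM first, then fit a one-dimensional logistic regression on the SVM scores, learning only a scaling A and a shift B (θ denotes the logistic function):

$$
g(\mathbf{x}) = \theta\bigl(A \cdot (\mathbf{w}_{\text{svm}}^{T}\Phi(\mathbf{x}) + b_{\text{svm}}) + B\bigr).
$$

A minimal two-level sketch (the dataset and parameters are assumptions; proper Platt scaling fits A and B on held-out scores, which scikit-learn's SVC(probability=True) does via internal cross-validation):

```python
# Two-level probabilistic SVM sketch: kernel SVM for the boundary,
# then 1-D logistic regression on the scores for probability estimates.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression

X, y = make_moons(n_samples=300, noise=0.25, random_state=0)  # toy data (assumption)

svm = SVC(kernel="rbf", C=1.0).fit(X, y)             # level 1: kernel SVM
scores = svm.decision_function(X).reshape(-1, 1)     # 1-D SVM scores

platt = LogisticRegression().fit(scores, y)          # level 2: learn A (slope), B (intercept)
proba = platt.predict_proba(scores)[:, 1]
print("first five probability estimates:", np.round(proba[:5], 3))
```

This way the flexible boundary comes from the kernel SVM, while the maximum-likelihood probability estimates come from logistic regression.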

(iii) Kernel logistic regression

The reason for merging logistic regression with SVM above is mainly to use SVM's kernel trick inside logistic regression. So the question now is: can we do kernel logistic regression directly?

First, understand this: to use the kernel trick, the weight vector w must be expressible in terms of the N data points. That is, the optimal w must be representable as a linear combination of the z_n.

When is this condition satisfied?
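
The answer (the representer-theorem argument; the supporting formula appears to have been lost) is: for any L2-regularized linear model, the optimal w always lies in the span of the z_n. Briefly, split any candidate w into a part w_∥ in the span of the z_n and a part w_⊥ orthogonal to it; w_⊥ changes none of the scores w^T z_n but strictly increases w^T w, so at the optimum w_⊥ = 0. Hence

$$
\mathbf{w}_{*} = \sum_{n=1}^{N} \beta_n \mathbf{z}_n .
$$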

Thus, we can do kernel logistic regression:
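
Substituting w = Σ_n β_n z_n into L2-regularized logistic regression and writing every inner product z_n^T z_m as a kernel value K(x_n, x_m) gives the kernel logistic regression problem (reconstructed here; the λ/N weighting is one common convention):

$$
\min_{\boldsymbol{\beta}} \;
\frac{\lambda}{N} \sum_{n=1}^{N}\sum_{m=1}^{N} \beta_n \beta_m K(\mathbf{x}_n, \mathbf{x}_m)
\;+\;
\frac{1}{N} \sum_{n=1}^{N} \log\!\Bigl(1 + \exp\Bigl(-y_n \sum_{m=1}^{N} \beta_m K(\mathbf{x}_m, \mathbf{x}_n)\Bigr)\Bigr).
$$

This is a smooth, unconstrained problem in β, so it can be solved with (stochastic) gradient descent. A minimal sketch (kernel choice, λ, step size, and iteration count are assumptions; labels must be in {-1, +1}):

```python
# Minimal kernel logistic regression via gradient descent on beta.
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # K[n, m] = exp(-gamma * ||x_n - x_m||^2)
    sq = np.sum(X ** 2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

def fit_klr(X, y, lam=0.1, lr=0.1, n_iter=2000, gamma=1.0):
    """y in {-1, +1}; returns the coefficients beta and the kernel matrix K."""
    N = len(y)
    K = rbf_kernel(X, gamma)
    beta = np.zeros(N)
    for _ in range(n_iter):
        s = K @ beta                          # scores: sum_m beta_m K(x_m, x_n)
        sigma = 1.0 / (1.0 + np.exp(y * s))   # theta(-y_n * s_n)
        grad = (2 * lam / N) * (K @ beta) - (1.0 / N) * (K @ (y * sigma))
        beta -= lr * grad
    return beta, K
```

Equivalently, kernel logistic regression can be read as ordinary linear logistic regression that uses the kernel values K(x_n, ·) as features, with a slightly unusual regularizer β^T K β.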
