machine_learning_cs229: Linear Regression (1)


This series is based on Andrew Ng's public Stanford machine learning course (CS229), together with my own programming implementations and study notes.

Chapter 1: Linear Regression

1. Linear regression

Linear regression is a method of supervised learning.

The main idea of linear regression is this: given a set of data, assume that it can be fitted by a linear expression of the form

h_θ(x) = θ_0 + θ_1*x_1 + ... + θ_n*x_n = θ^T x

How to find the parameter θ is the only remaining problem, so we need to define a loss function:

J(θ) = (1/2) * Σ_{i=1..m} (h_θ(x^(i)) - y^(i))^2

where m is the number of samples and (x^(i), y^(i)) denotes the i-th sample, so J is the total loss over all m samples.

This expression should look familiar: variance? Least squares? Yes, this is the classic least-squares model. Our task is to make the loss function J as small as possible.
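To make this concrete, here is a small Python/NumPy sketch (a minimal illustration of the formulas above, not code from the course; the toy data and the names X, y, theta are made up for this example) that evaluates the hypothesis h_θ(x) = θ^T x and the loss J(θ):

import numpy as np

def hypothesis(theta, X):
    # h_theta(x) = theta^T x for every row of X; X already contains a column of ones for the intercept term
    return X @ theta

def loss(theta, X, y):
    # J(theta) = 1/2 * sum_i (h_theta(x^(i)) - y^(i))^2
    residuals = hypothesis(theta, X) - y
    return 0.5 * np.sum(residuals ** 2)

# Toy example: m = 4 samples generated by y = 1 + 2x, one feature plus an intercept column.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])
theta = np.zeros(2)
print(loss(theta, X, y))   # loss of the all-zero parameter vector: 42.0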

P.S.: As for why we choose this J as our loss function, it looks natural on the surface, but the choice also has a probabilistic interpretation, which will be discussed in a later article.

2. LMS (least mean squares) algorithm

Our task now is to choose the parameter θ so that the loss function J is as small as possible; naturally, we think of the gradient descent method.

The most vivid explanation of gradient descent is this: you stand on top of a mountain, look around, take a small step in the direction of steepest descent, then look around again and take another small step in the direction of steepest descent; after many iterations you arrive at the lowest point.

Applying this idea here, we first select an initial θ and then iterate with the gradient descent update

θ_j := θ_j - α * ∂J(θ)/∂θ_j

where α is the learning rate (step size).

Working out this partial derivative for a single sample (x, y) gives ∂J/∂θ_j = (h_θ(x) - y) * x_j, so the iteration formula for a single sample becomes

θ_j := θ_j + α * (y - h_θ(x)) * x_j

which is the Widrow-Hoff rule we know well.

Analyzing this iteration formula, we see that the size of the update depends on the error between the real value y and the fitted value h_θ(x); this gives an intuitive understanding of both gradient descent and the Widrow-Hoff rule.
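As a tiny worked example of one such update (the numbers are made up purely for illustration): with θ = (0, 0), a single sample x = (1, 2) (the leading 1 is the intercept term), y = 5 and α = 0.1, one Widrow-Hoff step gives θ = (0.5, 1.0):

# One Widrow-Hoff / LMS update on a single sample, written out in plain Python.
theta = [0.0, 0.0]          # current parameters (theta_0, theta_1)
x = [1.0, 2.0]              # one sample; x_0 = 1 is the intercept term
y = 5.0                     # its target value
alpha = 0.1                 # learning rate

prediction = sum(t * xi for t, xi in zip(theta, x))            # h_theta(x) = 0
error = y - prediction                                         # y - h_theta(x) = 5
theta = [t + alpha * error * xi for t, xi in zip(theta, x)]    # update every theta_j
print(theta)                # [0.5, 1.0]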

With this rule, we can design the corresponding algorithms to drive J to its minimum.

Method One: Batch gradient descent

The idea is simple: each iteration sweeps over all m known samples, and we repeat until convergence.

Repeat until convergence {

    θ_j := θ_j + α * Σ_{i=1..m} (y^(i) - h_θ(x^(i))) * x_j^(i)    (for every j)

}
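Here is a minimal NumPy sketch of batch gradient descent for this loss (my own illustration, not the course's reference code; the fixed learning rate alpha and the iteration count are assumptions, and a real implementation would test for convergence instead of running a fixed number of steps):

import numpy as np

def batch_gradient_descent(X, y, alpha=0.01, num_iters=1000):
    # X is the m x (n+1) design matrix (first column all ones), y holds the m target values.
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(num_iters):
        errors = y - X @ theta            # y^(i) - h_theta(x^(i)) for every sample
        theta += alpha * (X.T @ errors)   # update every theta_j using all m samples at once
    return theta

# On the toy data y = 1 + 2x this lands very close to theta = (1, 2).
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])
print(batch_gradient_descent(X, y))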

Method Two: Stochastic gradient descent

Batch gradient descent has a big problem: when the data set is very large, each iteration takes a long time. Stochastic gradient descent may take some "detours", but because each update uses only a single sample, it can converge faster by comparison.

loop {

    for i = 1 to m {

        θ_j := θ_j + α * (y^(i) - h_θ(x^(i))) * x_j^(i)    (for every j)

    }

} until convergence
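And a matching sketch of stochastic gradient descent, again only an illustration (the fixed α, the fixed number of passes over the data, and the lack of sample shuffling or a convergence test are simplifications):

import numpy as np

def stochastic_gradient_descent(X, y, alpha=0.01, num_epochs=50):
    # Update theta after every single sample instead of after a full sweep.
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(num_epochs):
        for i in range(m):
            error = y[i] - X[i] @ theta     # y^(i) - h_theta(x^(i)) for this one sample
            theta += alpha * error * X[i]   # update every theta_j from this one sample
    return theta

# Same toy data; after 50 passes the result is already close to theta = (1, 2).
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])
print(stochastic_gradient_descent(X, y))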

In fact, gradient descent still leaves one question open: how to control the learning rate. Andrew Ng did not say much about this, so I may run some experiments later on how to choose the learning rate α to guarantee convergence.

3. How to find θ directly

In fact, we can use matrix operations to find the parameter θ directly, but this requires some matrix calculus; I may write a separate article to derive this θ.

Here is the answer directly:

θ = (X^T X)^(-1) * X^T * y

where X is the design matrix whose rows are the training inputs and y is the vector of target values.
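A short NumPy sketch of this closed-form solution, again only for illustration (np.linalg.solve is applied to the normal equations X^T X θ = X^T y rather than forming the inverse explicitly):

import numpy as np

# Closed-form least-squares solution theta = (X^T X)^(-1) X^T y.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])
theta = np.linalg.solve(X.T @ X, X.T @ y)
print(theta)   # [1. 2.]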
