Least Squares Learning (Part One)


In this article, basic least squares and least squares with constraints are explained.

1. Basic least squares method

The least squares method is the most basic algorithm in regression. It learns the parameters by minimizing the squared error between the model output and the training outputs (multiplied by 1/2 simply to make the derivation cleaner):

J(\theta) = \frac{1}{2}\sum_{i=1}^{n}\left(f_\theta(x_i) - y_i\right)^2
In particular, for a linear model f_\theta(x) = \sum_{j=1}^{b}\theta_j\,\phi_j(x) = \theta^{T}\phi(x), this becomes:

J(\theta) = \frac{1}{2}\left\|\Phi\theta - y\right\|^{2}
Setting the derivative with respect to \theta to zero gives the solution:

\nabla_\theta J = \Phi^{T}(\Phi\theta - y) = 0 \;\;\Rightarrow\;\; \hat{\theta} = (\Phi^{T}\Phi)^{-1}\Phi^{T}y = \Phi^{\dagger}y
where the design matrix \Phi is:

\Phi = \begin{pmatrix} \phi_1(x_1) & \cdots & \phi_b(x_1) \\ \vdots & \ddots & \vdots \\ \phi_1(x_n) & \cdots & \phi_b(x_n) \end{pmatrix}
% basic least squares
clear all; close all;
n = 50; N = 1000;
x = linspace(-3, 3, n)';            % training inputs
X = linspace(-3, 3, N)';            % dense grid for plotting the fitted curve
y = x.^2 + 2*x + 3 + 0.05*randn(n, 1);
% design matrices (basis functions: 1, x, x^2)
p(:, 1) = ones(n, 1); p(:, 2) = x; p(:, 3) = x.^2;
P(:, 1) = ones(N, 1); P(:, 2) = X; P(:, 3) = X.^2;
t = p\y;                            % least squares solution
F = P*t;                            % fitted curve on the dense grid
figure(1); clf; hold on;
axis([-3 3 0 20]);                  % y-limits reconstructed; the original was garbled
plot(X, F, 'g');
plot(x, y, 'bo');



Finally we obtain y = 0.9997*x^2 + 1.9999*x + 2.9896, which is close to the true model y = x^2 + 2x + 3.

The design matrix has dimension n × b. When the number of training samples is very large, solving for the parameters directly (computing the pseudoinverse) becomes expensive and easily runs out of memory; in that case the stochastic gradient algorithm often works well.

The stochastic gradient algorithm learns the target parameter θ by descending along the gradient of the training squared error J. The following steps apply stochastic gradient descent to least squares for a linear model (a code sketch follows the steps):

Step 1: Set an initial value for θ.

Step 2: Randomly select a training sample (x_i, y_i).

Step 3: Update θ by a gradient descent step on the selected sample, θ ← θ − ε (θ^T φ(x_i) − y_i) φ(x_i), where ε is the step size.

Step 4: Repeat steps 2 and 3 until θ reaches the desired convergence accuracy.
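
The following is a minimal MATLAB sketch of these steps applied to the quadratic example above; the step size, iteration count, and fixed loop in place of a convergence test are illustrative assumptions rather than values from the original post.

% Stochastic gradient least squares (minimal sketch; step size and
% iteration count are assumed for illustration, not from the original post).
clear all; close all;
n = 50;
x = linspace(-3, 3, n)';
y = x.^2 + 2*x + 3 + 0.05*randn(n, 1);
p = [ones(n, 1) x x.^2];            % training design matrix (basis: 1, x, x^2)
theta = zeros(3, 1);                % Step 1: initial value of theta
epsilon = 0.01;                     % step size (assumed)
for k = 1:100000                    % Step 4: in practice, loop until theta converges
    i = randi(n);                   % Step 2: randomly select a training sample
    phi = p(i, :)';                 % basis vector of the selected sample
    theta = theta - epsilon * (phi' * theta - y(i)) * phi;  % Step 3: gradient step
end
theta                               % should be close to [3; 2; 1] (constant, x, x^2 terms)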

2. Least squares with constraints

Basic least squares tends to overfit: the learned model fits the noise in the training data, mainly because the model is too complex relative to the training samples. To control model complexity, least squares with constraints is described below.

2.1 Least squares with partial space constraints

In the basic least squares method, the parameter θ is sought over the whole parameter space. Partial-space-constrained least squares prevents overfitting by restricting θ to a subspace:

\hat{\theta} = \mathop{\arg\min}_{\theta} J(\theta) \quad \text{subject to} \quad P\theta = \theta

The matrix P is an orthogonal projection matrix, usually set by hand or via principal component analysis. Under this constraint, the design matrix is simply right-multiplied by P, so the solution becomes \hat{\theta} = (\Phi P)^{\dagger}y.
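
As a rough illustration of the principal-component route mentioned above, the following MATLAB sketch builds a projection matrix P from an SVD of a polynomial design matrix and solves the constrained problem; the polynomial basis, the number of retained directions k, and the use of pinv are illustrative assumptions, not part of the original post.

% Sketch (assumed basis and k): build an orthogonal projection matrix P by
% PCA of the design matrix Phi, then solve the constrained least squares
% problem by right-multiplying the design matrix by P.
clear all;
n = 50;
x = linspace(-3, 3, n)';
y = x.^2 + 2*x + 3 + 0.05*randn(n, 1);
Phi = [ones(n,1) x x.^2 x.^3 x.^4];    % design matrix, b = 5 basis functions
[~, ~, V] = svd(Phi, 'econ');          % principal directions of the columns of Phi
k = 3;                                 % number of directions to keep (assumed)
P = V(:, 1:k) * V(:, 1:k)';            % b x b orthogonal projection matrix
theta = pinv(Phi * P) * y;             % constrained solution theta = (Phi*P)^+ * y
f = Phi * theta;                       % fitted values on the training inputs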

The following example applies partial-space-constrained least squares to a linear model with a trigonometric polynomial basis. The green curve is the result of basic least squares fitting: to reach the minimum squared error, the learned model becomes too complex for the training samples and overfits. The red curve is the result with the subspace constraint, and it clearly fits better than the green curve.


% basic least squares vs. least squares with partial space constraints
clear all; close all;
n = 50; N = 1000;
x = linspace(-3, 3, n)';            % training inputs
X = linspace(-3, 3, N)';            % dense grid for plotting
pix = pi*x;
y = sin(pix)./(pix) + 0.1*x + 0.2*randn(n, 1);
% design matrices with trigonometric polynomial basis
p(:, 1) = ones(n, 1); P(:, 1) = ones(N, 1);
for j = 1:15
    p(:, 2*j) = sin(j/2*x); p(:, 2*j+1) = cos(j/2*x);
    P(:, 2*j) = sin(j/2*X); P(:, 2*j+1) = cos(j/2*X);
end
t = p\y;                            % basic least squares
F = P*t;
% partial space constrained least squares:
% design matrix right-multiplied by the orthogonal projection matrix
t2 = (p*diag([ones(1, 11) zeros(1, 20)]))\y;
F2 = P*t2;
figure(1); clf; hold on;
axis([-3 3 -1 2]);
plot(X, F, 'g');
plot(x, y, 'bo');
plot(X, F2, 'r');


In effect, partial-space-constrained least squares uses only part of the parameter space. However, the projection matrix P is difficult to set up in practice, so least squares with an L2 constraint will be introduced later.





Copyright notice: This is the blogger's original article and may not be reproduced without the blogger's permission.
