Deep Learning: II (linear regression practice)


Transferred from: http://www.cnblogs.com/tornadomeet/archive/2013/03/15/2961660.html

Preface

This is a linear regression exercise, practiced here in its simplest form: a straight-line fit with just two parameters (an intercept and a slope), following the Stanford tutorial at http://openclassroom.stanford.edu/MainFolder/DocumentPage.php?course=DeepLearning&doc=exercises/ex2/ex2.html. The exercise supplies 50 data points: x holds the ages of 50 children, between 2 and 8 years (ages may be fractional), and y holds each child's corresponding height in meters, also given as a decimal. The task is to estimate the heights of children aged 3.5 and 7 from these 50 training samples. Plotting the training points shows at a glance that this is a typical linear regression problem.
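In the notation of the exercise, the model being fitted and the squared-error cost that both methods below minimize are

h_\theta(x) = \theta_0 + \theta_1 x,
J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2,

where m = 50 is the number of training samples; J is exactly the quantity computed as (0.5/m)*(x*t - y)'*(x*t - y) in the code further down.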

  Introduction to MATLAB functions:

legend:

For example, legend('Training data', 'Linear regression') labels the curves in a figure: here the first "curve" (actually a set of discrete points) is the training data, and the second curve (actually a straight line) is the fitted regression line.

hold on, hold off:

hold on keeps the current figure open so that subsequent plot commands draw on top of what is already there; hold off releases it, so the next plot command replaces the figure's contents.
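A minimal sketch showing hold and legend together (the data here is illustrative, not the exercise's):

x = 0:0.1:1;
plot(x, x.^2, 'o')                          % first "curve": discrete points
hold on                                     % keep the figure so the next plot is overlaid
plot(x, x, '-')                             % second curve: a straight line
legend('discrete points', 'straight line')  % one label per plotted curve, in plotting order
hold off                                    % release the figure; the next plot starts fresh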

linspace:

For example, linspace(-3, 3, 100) returns 100 numbers evenly (linearly) spaced between -3 and 3.

logspace:

For example, logspace(-2, 2, 15) returns 15 numbers between 10^(-2) and 10^(2). The exponents are spaced evenly, so the values themselves are spaced logarithmically (each successive value a roughly constant factor larger).
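A quick comparison of the two, as a minimal sketch (the values in the comments are what MATLAB/Octave returns):

v = linspace(-3, 3, 5)    % 5 evenly spaced values: -3  -1.5  0  1.5  3
w = logspace(-2, 2, 5)    % 5 logarithmically spaced values: 0.01  0.1  1  10  100
% logspace(-2, 2, 15) is used later to place the contour levels of the loss function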

  Experimental Results:

Training sample scatter plot with the fitted regression line:

Surface plot of the loss function J(\theta) over the parameters (\theta_0, \theta_1):

Contour plot of the loss function:

  Program code and comments:

Method 1: solving with the normal equations:

% Method 1: normal equations
x = load('ex2x.dat');
y = load('ex2y.dat');
plot(x, y, '*')
xlabel('Age in years')
ylabel('Height in meters')
x = [ones(size(x, 1), 1), x];           % prepend a column of ones for the intercept term
w = inv(x' * x) * x' * y                % closed-form solution: w = (X'X)^(-1) X'y
hold on
% plot(x, 0.0639*x + 0.7502)            % wrong: x now contains the intercept column
plot(x(:, 2), 0.0639*x(:, 2) + 0.7502)  % corrected code: plot against the age column only
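For reference, the closed-form expression for w used above comes from setting the gradient of J(\theta) to zero:

\nabla_\theta J(\theta) = \frac{1}{m} X^T (X\theta - y) = 0 \;\Rightarrow\; X^T X \, \theta = X^T y \;\Rightarrow\; \theta = (X^T X)^{-1} X^T y.

Numerically, w = (x'*x) \ (x'*y) or simply w = x \ y is preferable to calling inv(), since the backslash operator solves the least-squares system without forming an explicit inverse; the gradient descent code below uses exactly this backslash form for its reference solution.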

Method 2: solving with gradient descent:
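The loop below implements the batch (vectorized) gradient descent update

\theta := \theta - \alpha \cdot \frac{1}{m} X^T (X\theta - y),

or, per component, \theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}. Both components of \theta must be updated simultaneously from the same gradient; the commented-out "sequential update" in the code shows the incorrect alternative.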

% Exercise 2: Linear Regression
% Data is roughly based on 2000 CDC growth figures for boys:
% x refers to a boy's age, y is a boy's height in meters
clear all; close all; clc
x = load('ex2x.dat'); y = load('ex2y.dat');
m = length(y);                        % number of training examples

% Plot the training data
figure;                               % open a new figure window
plot(x, y, 'o');
ylabel('Height in meters'); xlabel('Age in years')

% Gradient descent
x = [ones(m, 1) x];                   % add a column of ones to x (intercept term)
theta = zeros(size(x(1,:)))';         % initialize fitting parameters to zero
MAX_ITR = 1500;                       % number of iterations, as in the Stanford exercise
alpha = 0.07;                         % learning rate
for num_iterations = 1:MAX_ITR
    % Vectorized version of the gradient descent update formula;
    % the summation formula from the videos is equally fine.
    grad = (1/m) .* x' * ((x * theta) - y);   % gradient of J(theta)
    theta = theta - alpha .* grad;            % simultaneous update of both parameters
    % Sequential update: the WRONG way to do gradient descent
    % grad1 = (1/m) .* x(:,1)' * ((x * theta) - y);
    % theta(1) = theta(1) + alpha*grad1;
    % grad2 = (1/m) .* x(:,2)' * ((x * theta) - y);
    % theta(2) = theta(2) + alpha*grad2;
end
theta                                 % print theta to the screen

% Plot the linear fit
hold on;                              % keep the previous plot visible
plot(x(:,2), x*theta, '-')
legend('Training data', 'Linear regression')  % label what each curve marker represents
hold off                              % don't overlay any more plots on this figure

% Closed-form solution for reference (covered in later videos)
exact_theta = (x' * x) \ x' * y

% Predict values for age 3.5 and 7
predict1 = [1, 3.5] * theta
predict2 = [1, 7] * theta

% Calculate J over a grid of parameter values
theta0_vals = linspace(-3, 3, 100);
theta1_vals = linspace(-1, 1, 100);
J_vals = zeros(length(theta0_vals), length(theta1_vals));   % initialize J_vals to zeros
for i = 1:length(theta0_vals)
    for j = 1:length(theta1_vals)
        t = [theta0_vals(i); theta1_vals(j)];
        J_vals(i,j) = (0.5/m) .* (x * t - y)' * (x * t - y);
    end
end
% Because of how meshgrids work in the surf command, transpose J_vals
% before calling surf, or else the axes will be flipped
J_vals = J_vals';

% Surface plot
figure;
surf(theta0_vals, theta1_vals, J_vals)
xlabel('\theta_0'); ylabel('\theta_1');

% Contour plot
figure;
% Plot J_vals as 15 contours spaced logarithmically between 0.01 and 100
contour(theta0_vals, theta1_vals, J_vals, logspace(-2, 2, 15))   % draw the contour lines
xlabel('\theta_0'); ylabel('\theta_1');   % '\' acts like an escape character (TeX markup); '_0' gives a single-digit subscript
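An optional check that is not part of the original exercise: record the cost after every update to confirm that gradient descent converges with this learning rate. The sketch below reuses x, y, m, alpha and MAX_ITR from the code above; J_history is an illustrative name.

J_history = zeros(MAX_ITR, 1);             % cost after each iteration
theta = zeros(2, 1);                       % restart from zero parameters
for k = 1:MAX_ITR
    grad = (1/m) .* x' * ((x * theta) - y);
    theta = theta - alpha .* grad;
    J_history(k) = (0.5/m) .* (x * theta - y)' * (x * theta - y);
end
figure; plot(1:MAX_ITR, J_history, '-')
xlabel('iteration'); ylabel('J(\theta)')   % J should decrease and flatten out if alpha is suitable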

  References:

http://openclassroom.stanford.edu/MainFolder/DocumentPage.php?course=DeepLearning&doc=exercises/ex2/ex2.html
