**Preface**

This article works through a linear regression exercise, in its simplest form: a straight-line fit with two parameters (intercept and slope), following the Stanford OpenClassroom exercise at http://openclassroom.stanford.edu/MainFolder/DocumentPage.php?course=DeepLearning&doc=exercises/ex2/ex2.html. The exercise provides 50 training samples: x holds the ages of 50 children, ranging from 2 to 8 years (ages may be fractional), and y holds the corresponding heights in meters, also given as decimals. The task is to use these 50 training samples to estimate the height of a child at ages 3.5 and 7. Plotting the distribution of the training points makes it intuitively clear that this is a typical linear regression problem.

**MATLAB function introduction:**

*legend:*

For example, `legend('training data', 'linear regression')` labels each curve in the figure: here the first "curve" (actually a set of discrete points) is the training data, and the second (a straight line) is the fitted regression line.

*hold on, hold off:*

`hold on` keeps the current figure open so that subsequent plotting commands draw on top of what is already there; `hold off` restores the default behavior, so the next plot replaces the figure's current contents.

*linspace:*

For example, `linspace(-3, 3, 100)` returns 100 numbers evenly (linearly) spaced between -3 and 3, endpoints included.
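For readers following along in Python, NumPy's `linspace` behaves the same way. This is just an illustrative sketch, not part of the original MATLAB exercise:

```python
import numpy as np

# 100 evenly spaced numbers from -3 to 3, endpoints included
pts = np.linspace(-3, 3, 100)

print(pts[0], pts[-1])   # -3.0 3.0
print(pts.size)          # 100
# consecutive points are a constant step apart: 6 / 99
print(np.allclose(np.diff(pts), 6 / 99))  # True
```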

*logspace:*

For example, `logspace(-2, 2, 15)` returns 15 numbers between 10^(-2) and 10^(2). The exponents are chosen evenly, with 10 as the base, so the values themselves are logarithmically spaced.
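The NumPy equivalent, again as a small illustrative sketch:

```python
import numpy as np

# 15 numbers from 10**-2 to 10**2; the exponents are evenly spaced,
# so the values themselves are logarithmically spaced
vals = np.logspace(-2, 2, 15)

print(vals[0], vals[-1])  # 0.01 100.0
# taking log10 recovers an evenly spaced grid of exponents
print(np.allclose(np.log10(vals), np.linspace(-2, 2, 15)))  # True
```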

**Experiment results:**

Scatter plot of the training samples with the fitted regression line:

Loss function plotted against the parameters:

Contour map of the loss function:

**Program code and notes:**

Use the normal equations method to solve the problem:

```matlab
% Method 1: normal equations
x = load('ex2x.dat');
y = load('ex2y.dat');
plot(x, y, '*')
xlabel('Age')
ylabel('Height')
x = [ones(size(x, 1), 1), x];  % add the intercept column
w = inv(x' * x) * x' * y
hold on
% plot(x, 0.0639 * x + 0.7502)            % buggy: x now has two columns
plot(x(:, 2), 0.0639 * x(:, 2) + 0.7502)  % corrected: plot against the age column
```
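As a cross-check on the closed-form step, here is a hedged NumPy sketch. The real `ex2x.dat`/`ex2y.dat` files are not reproduced here, so the data below is synthetic, generated around the fitted line reported above; only the recovery of the coefficients is being illustrated:

```python
import numpy as np

# Synthetic stand-in for the age/height data (NOT the real ex2 files):
# ages of 50 children between 2 and 8, heights near the reported fit
rng = np.random.default_rng(0)
age = rng.uniform(2, 8, size=50)
height = 0.0639 * age + 0.7502 + rng.normal(0, 0.01, size=50)

# Design matrix with an intercept column of ones, as in the MATLAB code
X = np.column_stack([np.ones_like(age), age])

# Normal equations: w = (X'X)^{-1} X'y
# (solve() is numerically preferable to forming the explicit inverse)
w = np.linalg.solve(X.T @ X, X.T @ height)
print(w)  # close to [0.7502, 0.0639]
```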

Use gradient descent to solve the problem:

```matlab
% Exercise 2: Linear Regression
% Data is roughly based on 2000 CDC growth figures for boys
% x refers to a boy's age
% y is a boy's height in meters

clear all; close all; clc

x = load('ex2x.dat');
y = load('ex2y.dat');
m = length(y); % number of training examples

% Plot the training data
figure; % open a new figure window
plot(x, y, 'o');
ylabel('Height in meters')
xlabel('Age in years')

% Gradient descent
x = [ones(m, 1) x];            % add a column of ones to x
theta = zeros(size(x(1,:)))';  % initialize fitting parameters
MAX_ITR = 1500;
alpha = 0.07;

for num_iterations = 1:MAX_ITR
    % This is a vectorized version of the gradient descent update formula;
    % it is also fine to use the summation formula from the videos.

    % Here is the gradient
    grad = (1/m) .* x' * ((x * theta) - y);

    % Here is the actual update
    theta = theta - alpha .* grad;

    % Sequential update: the WRONG way to do gradient descent
    % grad1 = (1/m) .* x(:,1)' * ((x * theta) - y);
    % theta(1) = theta(1) + alpha * grad1;
    % grad2 = (1/m) .* x(:,2)' * ((x * theta) - y);
    % theta(2) = theta(2) + alpha * grad2;
end

% Print theta to screen
theta

% Plot the linear fit
hold on;  % keep the previous plot visible
plot(x(:,2), x * theta, '-')
legend('Training data', 'Linear regression')  % label each curve in the figure
hold off  % don't overlay any more plots on this figure

% Closed-form solution, for reference
% (you will learn about this method in future videos)
exact_theta = (x' * x) \ x' * y

% Predict values for age 3.5 and 7
predict1 = [1, 3.5] * theta
predict2 = [1, 7] * theta

% Calculate the J matrix

% Grid over which we will calculate J
theta0_vals = linspace(-3, 3, 100);
theta1_vals = linspace(-1, 1, 100);

% Initialize J_vals to a matrix of 0's
J_vals = zeros(length(theta0_vals), length(theta1_vals));

for i = 1:length(theta0_vals)
    for j = 1:length(theta1_vals)
        t = [theta0_vals(i); theta1_vals(j)];
        J_vals(i, j) = (0.5/m) .* (x * t - y)' * (x * t - y);
    end
end

% Because of the way meshgrids work in the surf command, we need to
% transpose J_vals before calling surf, or else the axes will be flipped
J_vals = J_vals';

% Surface plot
figure;
surf(theta0_vals, theta1_vals, J_vals)
xlabel('\theta_0'); ylabel('\theta_1');

% Contour plot
figure;
% Plot J_vals as 15 contours spaced logarithmically between 0.01 and 100
contour(theta0_vals, theta1_vals, J_vals, logspace(-2, 2, 15))  % draw the contour lines
xlabel('\theta_0'); ylabel('\theta_1');
% In the labels, the backslash acts like an escape character (TeX markup),
% and the digit after '_' (0-9) becomes a subscript.
```
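The same vectorized batch update can be sketched in NumPy. As before, the data is synthetic, so the fitted numbers are illustrative only; also note the learning rate here is 0.05 rather than the script's 0.07, because with this synthetic age distribution 0.07 sits right at the edge of the stability bound for batch gradient descent:

```python
import numpy as np

# Synthetic stand-in for the ex2 data (the real .dat files are not included)
rng = np.random.default_rng(1)
age = rng.uniform(2, 8, size=50)
height = 0.0639 * age + 0.7502 + rng.normal(0, 0.01, size=50)

m = len(height)
X = np.column_stack([np.ones(m), age])   # add the intercept column
theta = np.zeros(2)                      # initialize fitting parameters
alpha, max_itr = 0.05, 1500

for _ in range(max_itr):
    # vectorized gradient of J(theta) = (1/(2m)) * ||X @ theta - y||^2
    grad = (1 / m) * X.T @ (X @ theta - height)
    # simultaneous update of both parameters (not the sequential version)
    theta = theta - alpha * grad

# Closed-form solution for reference; gradient descent should agree closely
exact_theta = np.linalg.solve(X.T @ X, X.T @ height)
print(theta, exact_theta)

# Predictions for ages 3.5 and 7, as in the exercise
print(np.array([1.0, 3.5]) @ theta, np.array([1.0, 7.0]) @ theta)
```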

**References:**

http://openclassroom.stanford.edu/MainFolder/DocumentPage.php?course=DeepLearning&doc=exercises/ex2/ex2.html