On univariate linear regression, multivariate linear regression, logistic regression, and more in Ng's videos


Tomorrow my first class is not until 8:55, so I might as well tidy up what I looked at today.

Today I mainly worked through the MATLAB implementations of univariate linear regression, multivariate linear regression, and logistic regression from the first few chapters of Ng's course. I used to think I understood the theory well enough, but writing the code myself turned out to be hard. Looking at the expert's code today, though, it was actually quite easy to follow... the trick is a very skillful use of matrices.

For example, in univariate regression there are the two cases j=0 and j=1. The code simply converts x into x = [ones(m, 1) x], so that the first column is all ones; this folds the x=1 of the j=0 case into the same operation. The gradient then becomes grad = (1/m) .* x' * ((x * theta) - y), the update is theta = theta - alpha .* grad, and an outer loop iterates to find theta0 and theta1 (both actually live in the theta matrix).
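As a sketch of the step being glossed over here (standard least-squares algebra in LaTeX notation, not spelled out in the original post): for the cost

    J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \big( h_\theta(x^{(i)}) - y^{(i)} \big)^2,
    \qquad
    \frac{\partial J}{\partial \theta_j} = \frac{1}{m} \sum_{i=1}^{m} \big( h_\theta(x^{(i)}) - y^{(i)} \big)\, x_j^{(i)},

with x_0^{(i)} = 1 supplied by the column of ones. Letting X be the m-by-2 matrix whose first column is all ones, the j=0 and j=1 partial derivatives stack into a single matrix expression,

    \nabla J(\theta) = \frac{1}{m} X^\top (X\theta - y),
    \qquad
    \theta \leftarrow \theta - \alpha\, \nabla J(\theta),

which is exactly the grad and theta lines in the code below.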

clear all; close all; clc
x = load('ex2x.dat');
y = load('ex2y.dat');
m = length(y); % number of training examples

% Plot the training data
figure; % open a new figure window
plot(x, y, 'o');
ylabel('Height in meters')
xlabel('Age in years')

% Gradient descent
x = [ones(m, 1) x]; % add a column of ones to x (the j=0 feature)
theta = zeros(size(x(1,:)))'; % initialize fitting parameters
MAX_ITR = 1500;
alpha = 0.07;

for num_iterations = 1:MAX_ITR
    % This is a vectorized version of the gradient descent update formula.
    % It is also fine to use the summation formula from the videos.
    % The derivative of the cost function (the gradient): the j=0 and j=1
    % cases are cleverly folded into one formula by the matrix method:
    % grad(0) = 1/m * sum(h - y), grad(1) = 1/m * sum((h - y) .* x)
    grad = (1/m) .* x' * ((x * theta) - y);
    % Here is the actual update
    theta = theta - alpha .* grad;

    % Sequential update: the WRONG way to do gradient descent
    % grad1 = (1/m) .* x(:,1)' * ((x * theta) - y);
    % theta(1) = theta(1) + alpha*grad1;
    % grad2 = (1/m) .* x(:,2)' * ((x * theta) - y);
    % theta(2) = theta(2) + alpha*grad2;
end

% Print theta to screen
theta

% Plot the linear fit
hold on; % keep previous plot visible
plot(x(:,2), x*theta, '-')
legend('Training data', 'Linear regression') % label each curve in the figure
hold off % don't overlay any more plots on this figure
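As a quick usage sketch once the script above has run (the ages 3.5 and 7 are illustrative inputs of my own, not from the original post): a prediction is just the same [1, x] * theta product, with the leading 1 pairing with the intercept theta(1).

% Predict heights for two example ages with the fitted parameters
predict1 = [1, 3.5] * theta    % predicted height for age 3.5
predict2 = [1, 7] * theta      % predicted height for age 7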


Then there is solving for the parameters directly: the single statement w = inv(x'*x)*x'*y (the normal equation; for this quadratic cost, one Newton step lands on exactly this solution) finds theta0 and theta1 at once. It seems I need to look further into this optimization material.
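For reference, a one-line sketch of where that statement comes from (standard normal-equation algebra, not from the post): setting the gradient of the cost to zero,

    \frac{1}{m} X^\top (X\theta - y) = 0
    \;\Longrightarrow\;
    X^\top X\, \theta = X^\top y
    \;\Longrightarrow\;
    \theta = (X^\top X)^{-1} X^\top y,

which in MATLAB is precisely inv(x'*x)*x'*y.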


%% Method one
x = load('ex2x.dat');
y = load('ex2y.dat');
plot(x, y, '*')
xlabel('Age in years')
ylabel('Height in meters')
x = [ones(size(x,1), 1), x];
w = inv(x'*x)*x'*y
hold on
% plot(x, 0.0639*x + 0.7502)
plot(x(:,2), 0.0639*x(:,2) + 0.7502)    % corrected code
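A small side note of my own (a sketch assuming the same ex2x.dat/ex2y.dat files): MATLAB's backslash operator solves the identical least-squares problem without explicitly forming inv(x'*x), which is the numerically preferred idiom.

%% Least squares via backslash (QR-based); same solution, better conditioned
x = load('ex2x.dat');
y = load('ex2y.dat');
x = [ones(size(x,1), 1), x];
w = x \ y    % equivalent to inv(x'*x)*x'*y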

Afterwards comes the three-dimensional plot of the cost function: a single surf call draws the 3-D surface directly.

Refer to: http://huzhyi21.blog.163.com/blog/static/1007396201061052214302/

% Calculate J matrix
% Grid over which we will calculate J
theta0_vals = linspace(-3, 3, 100);
theta1_vals = linspace(-1, 1, 100);

% Initialize J_vals to a matrix of 0's
J_vals = zeros(length(theta0_vals), length(theta1_vals));

for i = 1:length(theta0_vals)
    for j = 1:length(theta1_vals)
        t = [theta0_vals(i); theta1_vals(j)];
        % J_vals is the cost function: 1/(2m) * sum((h - y).^2)
        J_vals(i,j) = (1/(2*m)) .* (x * t - y)' * (x * t - y);    % note 1/(2*m), not 1/2*m
    end
end

% Because of the way meshgrids work in the surf command, we need to
% transpose J_vals before calling surf, or else the axes will be flipped
J_vals = J_vals';

% Surface plot
figure;
surf(theta0_vals, theta1_vals, J_vals)
xlabel('\theta_0'); ylabel('\theta_1');
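As a companion sketch of my own (the contour levels here are an assumption, chosen to spread nicely over this surface): a contour view of the same J_vals shows at a glance where the minimum sits, and the theta found by gradient descent can be marked on it.

% Contour plot of the same cost surface (J_vals already transposed above)
figure;
contour(theta0_vals, theta1_vals, J_vals, logspace(-2, 2, 15))
xlabel('\theta_0'); ylabel('\theta_1');
hold on
plot(theta(1), theta(2), 'rx')    % mark the gradient-descent solution
hold off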

