Andrew Ng Machine Learning: Week 5 Programming Exercise Code


Andrew Ng Machine Learning, Week 5 Programming Exercise: Overview

This week covers the backpropagation algorithm, which computes the partial derivatives of the cost function with respect to the neural network's parameters Theta. The main task of the exercise is to implement this algorithm in nnCostFunction.m.
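For reference (this summary is not in the original post; notation follows the course), the code below computes the regularized cost for a two-layer network with K output classes,

\[
J(\Theta) = -\frac{1}{m}\sum_{i=1}^{m}\sum_{k=1}^{K}\Big[ y^{(i)}_k \log\big(h_\Theta(x^{(i)})\big)_k + \big(1-y^{(i)}_k\big)\log\big(1-(h_\Theta(x^{(i)}))_k\big)\Big] + \frac{\lambda}{2m}\Big[\sum_{j,k}\big(\Theta^{(1)}_{j,k}\big)^2 + \sum_{j,k}\big(\Theta^{(2)}_{j,k}\big)^2\Big],
\]

where the regularization sums skip the bias columns, together with the gradients that backpropagation accumulates through the error terms \(\delta^{(l)}\) and \(\Delta^{(l)}\):

\[
\frac{\partial J}{\partial \Theta^{(l)}_{ij}} = \frac{1}{m}\Delta^{(l)}_{ij} + \frac{\lambda}{m}\Theta^{(l)}_{ij} \ (j \ge 1), \qquad
\frac{\partial J}{\partial \Theta^{(l)}_{ij}} = \frac{1}{m}\Delta^{(l)}_{ij} \ (j = 0).
\]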

nnCostFunction.m
function [J grad] = nnCostFunction(nn_params, ...
                                   input_layer_size, ...
                                   hidden_layer_size, ...
                                   num_labels, ...
                                   X, y, lambda)
%NNCOSTFUNCTION Implements the neural network cost function for a two layer
%neural network which performs classification
%   [J grad] = NNCOSTFUNCTION(nn_params, hidden_layer_size, num_labels, ...
%   X, y, lambda) computes the cost and gradient of the neural network. The
%   parameters for the neural network are "unrolled" into the vector
%   nn_params and need to be converted back into the weight matrices.
%
%   The returned parameter grad should be an "unrolled" vector of the
%   partial derivatives of the neural network.
%

% Reshape nn_params back into the parameters Theta1 and Theta2, the weight
% matrices for our 2 layer neural network
Theta1 = reshape(nn_params(1:hidden_layer_size * (input_layer_size + 1)), ...
                 hidden_layer_size, (input_layer_size + 1));

Theta2 = reshape(nn_params((1 + (hidden_layer_size * (input_layer_size + 1))):end), ...
                 num_labels, (hidden_layer_size + 1));

% Setup some useful variables
m = size(X, 1);

% You need to return the following variables correctly
J = 0;
Theta1_grad = zeros(size(Theta1));
Theta2_grad = zeros(size(Theta2));

% ====================== YOUR CODE HERE ======================

% Remember to add the bias unit
X = [ones(m, 1) X];

% Convert the label vector y into an m x num_labels matrix of 0/1 rows
Y = zeros(m, num_labels);
for i = 1:m
    Y(i, y(i)) = 1;
end

% Forward propagation
a2 = sigmoid(X * Theta1');
a2 = [ones(size(a2, 1), 1) a2];
a3 = sigmoid(a2 * Theta2');

% Unregularized cost
J = sum(sum(-Y .* log(a3) - (1 - Y) .* log(1 - a3))) / m;

Theta1_withoutbias = Theta1(:, 2:end);
Theta2_withoutbias = Theta2(:, 2:end);

% Regularized cost function (bias columns are excluded)
J = J + lambda * (sum(sum(Theta1_withoutbias .^ 2)) + sum(sum(Theta2_withoutbias .^ 2))) / (2 * m);

% Backpropagation
d1 = zeros(size(Theta1));
d2 = zeros(size(Theta2));

% Copies of Theta with the bias column zeroed, used when regularizing the gradients
Theta1_wtbias = Theta1;
Theta1_wtbias(:, 1) = 0;
Theta2_wtbias = Theta2;
Theta2_wtbias(:, 1) = 0;

for t = 1:m
    yt  = Y(t, :);
    a3t = a3(t, :);
    a2t = a2(t, :);
    a1t = X(t, :);

    delta3 = a3t - yt;
    delta2 = (delta3 * Theta2) .* (a2t .* (1 - a2t));
    delta2 = delta2(2:end);

    d2 = d2 + delta3' * a2t;
    d1 = d1 + delta2' * a1t;
end

% Regularized gradients
Theta1_grad = Theta1_grad + d1 / m + lambda * Theta1_wtbias / m;
Theta2_grad = Theta2_grad + d2 / m + lambda * Theta2_wtbias / m;

% -------------------------------------------------------------
% =========================================================================

% Unroll gradients
grad = [Theta1_grad(:); Theta2_grad(:)];

end
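A minimal usage sketch (not part of the original post): assuming the course-provided helpers sigmoid.m, randInitializeWeights.m, and fmincg.m are on the path, and that X, y, input_layer_size, hidden_layer_size, and num_labels are already loaded, nnCostFunction can be handed to fmincg to train the network, roughly as in ex4.m:

% Randomly initialize the weights and unroll them into a single vector
initial_Theta1 = randInitializeWeights(input_layer_size, hidden_layer_size);
initial_Theta2 = randInitializeWeights(hidden_layer_size, num_labels);
initial_nn_params = [initial_Theta1(:); initial_Theta2(:)];

% Wrap nnCostFunction so it only takes the unrolled parameter vector
lambda = 1;
costFunction = @(p) nnCostFunction(p, input_layer_size, hidden_layer_size, ...
                                   num_labels, X, y, lambda);

% Minimize the cost; fmincg only needs the cost value and the unrolled gradient
options = optimset('MaxIter', 50);
[nn_params, cost] = fmincg(costFunction, initial_nn_params, options);

% Reshape the optimized vector back into the weight matrices
Theta1 = reshape(nn_params(1:hidden_layer_size * (input_layer_size + 1)), ...
                 hidden_layer_size, (input_layer_size + 1));
Theta2 = reshape(nn_params((1 + (hidden_layer_size * (input_layer_size + 1))):end), ...
                 num_labels, (hidden_layer_size + 1));

Running the numerical gradient check from the exercise (checkNNGradients) before training is a cheap way to confirm that the backpropagation code above is correct.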
