Machine Learning: Backpropagation (BP) Algorithm Implementation in MATLAB

Source: https://www.cnblogs.com/liu-wang/p/9466123.html

%% Machine Learning Online Class - Exercise 4: Neural Network Learning
%
%  Instructions
%  ------------
%  This file contains code that helps you get started on the
%  exercise. You will need to complete the following functions
%  in this exercise:
%
%     sigmoidGradient.m
%     randInitializeWeights.m
%     nnCostFunction.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.

%% Initialization
clear; close all; clc

%% Setup the parameters you will use for this exercise
input_layer_size  = 400;  % 20x20 input images of digits
hidden_layer_size = 25;   % 25 hidden units
num_labels = 10;          % 10 labels, from 1 to 10
                          % (note that "0" has been mapped to label 10)

%% =========== Part 1: Loading and Visualizing Data =============
%  We start the exercise by first loading and visualizing the dataset.
%  You will be working with a dataset that contains handwritten digits.

% Load training data
fprintf('Loading and Visualizing Data ...\n')
load('ex4data1.mat');
m = size(X, 1);

% Randomly select 100 data points to display
sel = randperm(size(X, 1));
sel = sel(1:100);
displayData(X(sel, :));

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ================ Part 2: Loading Parameters ================
%  In this part of the exercise, we load some pre-initialized
%  neural network parameters.

fprintf('\nLoading Saved Neural Network Parameters ...\n')

% Load the weights into variables Theta1 and Theta2
load('ex4weights.mat');

% Unroll parameters
nn_params = [Theta1(:); Theta2(:)];
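Unrolling stacks both weight matrices into one long vector so a generic optimizer can treat all parameters as a single argument; the cost function later reshapes them back. A minimal NumPy sketch of that round trip, mirroring MATLAB's column-major `Theta(:)` with `order='F'` (the names `unroll`/`reroll` are illustrative, not course functions):

```python
import numpy as np

def unroll(theta1, theta2):
    # Stack both weight matrices into one flat vector, column-major
    # ('F') to mirror MATLAB's Theta(:) behavior.
    return np.concatenate([theta1.ravel(order='F'), theta2.ravel(order='F')])

def reroll(nn_params, input_size, hidden_size, num_labels):
    # Invert the unrolling: split the vector and reshape each part.
    split = hidden_size * (input_size + 1)
    theta1 = nn_params[:split].reshape(hidden_size, input_size + 1, order='F')
    theta2 = nn_params[split:].reshape(num_labels, hidden_size + 1, order='F')
    return theta1, theta2

if __name__ == '__main__':
    rng = np.random.default_rng(0)
    t1 = rng.normal(size=(25, 401))   # hidden_layer_size x (input_layer_size + 1)
    t2 = rng.normal(size=(10, 26))    # num_labels x (hidden_layer_size + 1)
    params = unroll(t1, t2)
    r1, r2 = reroll(params, 400, 25, 10)
    print(params.shape, np.allclose(t1, r1) and np.allclose(t2, r2))  # -> (10285,) True
```

The 10285-element vector is exactly 25*401 + 10*26 parameters, the same count fmincg will optimize later in the script.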
%% ================ Part 3: Compute Cost (Feedforward) ================
%  For the neural network, you should first start by implementing the
%  feedforward part of the neural network that returns the cost only.
%  You should complete the code in nnCostFunction.m to return the cost.
%  After implementing the feedforward to compute the cost, you can
%  verify that your implementation is correct by checking that you get
%  the same cost as us for the fixed debugging parameters.
%
%  We suggest implementing the feedforward cost *without* regularization
%  first, so that it will be easier for you to debug. Later, in Part 4,
%  you will get to implement the regularized cost.

fprintf('\nFeedforward Using Neural Network ...\n')

% Weight regularization parameter (we set this to 0 here).
lambda = 0;

J = nnCostFunction(nn_params, input_layer_size, hidden_layer_size, ...
                   num_labels, X, y, lambda);

fprintf(['Cost at parameters (loaded from ex4weights): %f ' ...
         '\n(this value should be about 0.287629)\n'], J);

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% =============== Part 4: Implement Regularization ===============
%  Once your cost function implementation is correct, you should now
%  continue to implement the regularization with the cost.

fprintf('\nChecking Cost Function (w/ Regularization) ...\n')

% Weight regularization parameter (we set this to 1 here).
lambda = 1;

J = nnCostFunction(nn_params, input_layer_size, hidden_layer_size, ...
                   num_labels, X, y, lambda);

fprintf(['Cost at parameters (loaded from ex4weights): %f ' ...
         '\n(this value should be about 0.383770)\n'], J);

fprintf('Program paused. Press enter to continue.\n');
pause;
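The cost checked in Parts 3 and 4 is the cross-entropy over all output units, J = (1/m) * sum(-y.*log(h) - (1-y).*log(1-h)), plus a ridge penalty (lambda/2m) * sum of squared weights, with the bias columns excluded from the penalty. A NumPy sketch of that feedforward cost under the same two-layer architecture (illustrative, not the graded nnCostFunction.m):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nn_cost(theta1, theta2, X, y, num_labels, lam):
    # Feedforward: a1 -> a2 -> h (hypothesis), with bias units prepended.
    m = X.shape[0]
    a1 = np.hstack([np.ones((m, 1)), X])
    a2 = np.hstack([np.ones((m, 1)), sigmoid(a1 @ theta1.T)])
    h = sigmoid(a2 @ theta2.T)                       # m x num_labels

    # One-hot encode the labels (values 1..num_labels, as in the exercise).
    Y = np.eye(num_labels)[y - 1]

    # Cross-entropy cost plus regularization (bias columns excluded).
    J = (-Y * np.log(h) - (1 - Y) * np.log(1 - h)).sum() / m
    reg = (lam / (2 * m)) * ((theta1[:, 1:] ** 2).sum() + (theta2[:, 1:] ** 2).sum())
    return J + reg
```

With lam = 0 this is the unregularized Part 3 cost; raising lam only adds the penalty term, which is why the regularized value (0.383770) exceeds the unregularized one (0.287629) for the same weights.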
%% ================ Part 5: Sigmoid Gradient ================
%  Before you start implementing the neural network, you will first
%  implement the gradient for the sigmoid function. You should complete
%  the code in sigmoidGradient.m.

fprintf('\nEvaluating sigmoid gradient...\n')

g = sigmoidGradient([-1 -0.5 0 0.5 1]);
fprintf('Sigmoid gradient evaluated at [-1 -0.5 0 0.5 1]:\n');
fprintf('%f ', g);
fprintf('\n');

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ================ Part 6: Initializing Parameters ================
%  In this part of the exercise, you will start to implement a two-layer
%  neural network that classifies digits. You will start by implementing
%  a function to initialize the weights of the neural network
%  (randInitializeWeights.m).

fprintf('\nInitializing Neural Network Parameters ...\n')

initial_Theta1 = randInitializeWeights(input_layer_size, hidden_layer_size);
initial_Theta2 = randInitializeWeights(hidden_layer_size, num_labels);

% Unroll parameters
initial_nn_params = [initial_Theta1(:); initial_Theta2(:)];

%% =============== Part 7: Implement Backpropagation ===============
%  Once your cost matches up with ours, you should proceed to implement
%  the backpropagation algorithm for the neural network. You should add
%  to the code you have written in nnCostFunction.m to return the
%  partial derivatives of the parameters.

fprintf('\nChecking Backpropagation ...\n');

% Check gradients by running checkNNGradients
checkNNGradients;

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% =============== Part 8: Implement Regularization ===============
%  Once your backpropagation implementation is correct, you should now
%  continue to implement the regularization with the cost and gradient.

fprintf('\nChecking Backpropagation (w/ Regularization) ...\n')

% Check gradients by running checkNNGradients
lambda = 3;
checkNNGradients(lambda);

% Also output the cost function debugging value
debug_J = nnCostFunction(nn_params, input_layer_size, ...
                         hidden_layer_size, num_labels, X, y, lambda);

fprintf(['\n\nCost at (fixed) debugging parameters (w/ lambda = 3): %f ' ...
         '\n(this value should be about 0.576051)\n'], debug_J);

fprintf('Program paused. Press enter to continue.\n');
pause;
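Part 5's identity, sigmoidGradient(z) = sigmoid(z) .* (1 - sigmoid(z)), and the idea behind checkNNGradients in Parts 7 and 8, can both be illustrated by comparing an analytic derivative against centered finite differences. A small sketch in that spirit (not the course's checker, which perturbs each unrolled network parameter in turn):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_gradient(z):
    # d/dz sigmoid(z) = sigmoid(z) * (1 - sigmoid(z))
    g = sigmoid(z)
    return g * (1 - g)

def numerical_gradient(f, z, eps=1e-4):
    # Centered difference, the same scheme checkNNGradients uses.
    return (f(z + eps) - f(z - eps)) / (2 * eps)

if __name__ == '__main__':
    z = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
    analytic = sigmoid_gradient(z)
    numeric = numerical_gradient(sigmoid, z)
    print(analytic)                             # peaks at 0.25 for z = 0
    print(np.max(np.abs(analytic - numeric)))   # ~1e-9 or smaller
```

When the analytic and numerical gradients agree to within roughly 1e-9, the derivative is almost certainly correct; the same criterion validates the backpropagation partials before training.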
%% =================== Part 8: Training NN ===================
%  You have now implemented all the code necessary to train a neural
%  network. To train your neural network, we will now use fmincg, which
%  is a function that works similarly to fminunc. Recall that these
%  advanced optimizers are able to train our cost functions efficiently
%  as long as we provide them with the gradient computations.

fprintf('\nTraining Neural Network ...\n')

%  After you have completed the assignment, change MaxIter to a larger
%  value to see how more training helps.
options = optimset('MaxIter', 50);

%  You should also try different values of lambda
lambda = 1;

% Create "short hand" for the cost function to be minimized
costFunction = @(p) nnCostFunction(p, ...
                                   input_layer_size, ...
                                   hidden_layer_size, ...
                                   num_labels, X, y, lambda);

% Now, costFunction is a function that takes in only one argument (the
% neural network parameters)
[nn_params, cost] = fmincg(costFunction, initial_nn_params, options);

% Obtain Theta1 and Theta2 back from nn_params
Theta1 = reshape(nn_params(1:hidden_layer_size * (input_layer_size + 1)), ...
                 hidden_layer_size, (input_layer_size + 1));
Theta2 = reshape(nn_params((1 + (hidden_layer_size * (input_layer_size + 1))):end), ...
                 num_labels, (hidden_layer_size + 1));

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ================= Part 9: Visualize Weights =================
%  You can now "visualize" what the neural network is learning by
%  displaying the hidden units to see what features they are capturing
%  in the data.

fprintf('\nVisualizing Neural Network ...\n')

displayData(Theta1(:, 2:end));

fprintf('\nProgram paused. Press enter to continue.\n');
pause;

%% ================= Part 10: Implement Predict =================
%  After training the neural network, we would like to use it to predict
%  the labels. You will now implement the "predict" function to use the
%  neural network to predict the labels of the training set. This lets
%  you compute the training set accuracy.

pred = predict(Theta1, Theta2, X);

fprintf('\nTraining Set Accuracy: %f\n', mean(double(pred == y)) * 100);
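Part 10's predict runs one feedforward pass and takes the most probable output unit per example. A NumPy sketch of that step, returning 1-based labels to match the exercise's convention (the tiny hand-picked weights are illustrative only):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(theta1, theta2, X):
    # Forward propagate and pick the highest-scoring output unit per row.
    m = X.shape[0]
    a2 = sigmoid(np.hstack([np.ones((m, 1)), X]) @ theta1.T)
    h = sigmoid(np.hstack([np.ones((m, 1)), a2]) @ theta2.T)
    return np.argmax(h, axis=1) + 1      # 1-based labels, as in MATLAB

if __name__ == '__main__':
    # Tiny illustrative weights: 2 inputs -> 2 hidden units -> 2 labels.
    theta1 = np.array([[0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
    theta2 = np.array([[0.0, 4.0, -4.0], [0.0, -4.0, 4.0]])
    X = np.array([[3.0, -3.0], [-3.0, 3.0]])
    pred = predict(theta1, theta2, X)
    y = np.array([1, 2])
    print(pred, np.mean(pred == y) * 100)  # accuracy in percent
```

The script's final line computes accuracy the same way: compare predicted labels against y element-wise, take the mean, and scale to percent.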

  

