This semester I have been following the Machine Learning course on Coursera. The instructor, Andrew Ng, is one of the founders of Coursera and a leading figure in machine learning. The course is a good choice for anyone who wants to understand and master machine learning: it covers the basic concepts and methods, and the programming assignments play a large role in consolidating them.
Course Address https://www.coursera.org/learn/machine-learning
These notes summarize the course content, together with the MATLAB programming assignments.
Neural Networks
Week 4 Programming Assignment: Core Code
nnCostFunction.m
% First, convert the original labels y into one-hot vector form.
y_vect = zeros(m, num_labels);                 % 5000x10
for i = 1:m,
    y_vect(i, y(i)) = 1;
end;

% Forward propagation
a1 = [ones(m, 1) X];                           % 5000x401
z2 = a1 * Theta1';                             % 5000x25
a2 = sigmoid(z2);                              % 5000x25
a2 = [ones(m, 1) a2];                          % 5000x26
z3 = a2 * Theta2';                             % 5000x10
a3 = sigmoid(z3);                              % 5000x10

% Unregularized cost (J is initialized to 0 by the assignment template)
for i = 1:m
    J = J + sum(-1 * y_vect(i,:) .* log(a3(i,:)) - (1 - y_vect(i,:)) .* log(1 - a3(i,:)));
end;
J = J / m;
% Regularization term (bias columns excluded)
J = J + lambda * (sum(sum(Theta1(:,2:end).^2)) + sum(sum(Theta2(:,2:end).^2))) / 2 / m;

% Backward propagation
Delta1 = zeros(size(Theta1));                  % 25x401
Delta2 = zeros(size(Theta2));                  % 10x26
for i = 1:m
    delta3 = a3(i,:)' - y_vect(i,:)';          % 10x1
    tempTheta2 = Theta2' * delta3;             % 26x10 * 10x1 = 26x1
    delta2 = tempTheta2(2:end) .* sigmoidGradient(z2(i,:)');   % 25x1
    Delta2 = Delta2 + delta3 * a2(i,:);        % 10x1 * 1x26
    Delta1 = Delta1 + delta2 * a1(i,:);        % 25x1 * 1x401
end;

Theta2_grad = Delta2 / m;
Theta1_grad = Delta1 / m;

% Regularized gradient (bias columns excluded)
Theta2_grad(:,2:end) = Theta2_grad(:,2:end) + lambda * Theta2(:,2:end) / m;
Theta1_grad(:,2:end) = Theta1_grad(:,2:end) + lambda * Theta1(:,2:end) / m;
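For readers who prefer Python, the one-hot conversion and the unregularized cross-entropy cost loop above can be sketched in plain Python. This is a toy illustration only (2 samples, 3 classes, made-up activations), not the assignment's code; the function names here are my own:

```python
import math

def one_hot(y, num_labels):
    """Convert 1-based integer labels to one-hot rows, as in nnCostFunction.m."""
    return [[1 if label == k + 1 else 0 for k in range(num_labels)]
            for label in y]

def cost(a3, y_vect):
    """Unregularized cross-entropy cost, mirroring the per-sample loop above."""
    m = len(y_vect)
    J = 0.0
    for i in range(m):
        for k in range(len(y_vect[i])):
            J += (-y_vect[i][k] * math.log(a3[i][k])
                  - (1 - y_vect[i][k]) * math.log(1 - a3[i][k]))
    return J / m

# Tiny example: 2 samples, 3 classes, hypothetical output activations
y = [2, 3]
y_vect = one_hot(y, 3)          # [[0, 1, 0], [0, 0, 1]]
a3 = [[0.1, 0.8, 0.1],
      [0.2, 0.1, 0.7]]
print(round(cost(a3, y_vect), 4))
```

The double loop makes the formula explicit; the MATLAB version folds the inner sum over classes into vectorized element-wise operations.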
sigmoidGradient.m
g = sigmoid(z) .* (1 - sigmoid(z));
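The identity behind this one-liner is g'(z) = g(z)(1 − g(z)). A quick pure-Python sanity check (function names are my own, not from the assignment), comparing the formula against a numerical derivative:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_gradient(z):
    # g'(z) = g(z) * (1 - g(z))
    g = sigmoid(z)
    return g * (1.0 - g)

def numeric_grad(f, z, eps=1e-6):
    # Central-difference approximation of f'(z)
    return (f(z + eps) - f(z - eps)) / (2 * eps)

print(sigmoid_gradient(0.0))    # 0.25, the maximum of the gradient
print(abs(sigmoid_gradient(1.3) - numeric_grad(sigmoid, 1.3)) < 1e-7)
```

The gradient peaks at 0.25 when z = 0 and decays toward 0 for large |z|, which is why backpropagation through saturated sigmoid units learns slowly.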
Andrew Ng's Machine Learning course notes (Week 4): Multi-class Classification and Neural Networks