Neural Network Assignment: Coursera Machine Learning (Andrew Ng), Week 5


The Week 5 assignment uses supervised learning to recognize handwritten digits with a neural network (NN) for multi-class classification. Its main purpose is to get a feel for how to compute the network's cost function and the gradient of the cost with respect to each parameter (Theta), using backpropagation.
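Two building blocks the assignment leans on are the sigmoid activation and its derivative, which backpropagation multiplies the propagated error by. A minimal sketch in Python/NumPy for illustration (the assignment itself is in MATLAB and provides `sigmoid` and `sigmoidGradient`; the names below are my own):

```python
import numpy as np

def sigmoid(z):
    # Logistic activation g(z) = 1 / (1 + e^{-z}), applied element-wise.
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_gradient(z):
    # Derivative g'(z) = g(z) * (1 - g(z)); backpropagation multiplies
    # the error propagated from the next layer by this term.
    g = sigmoid(z)
    return g * (1.0 - g)
```

Note that the derivative peaks at 0.25 when z = 0 and vanishes for large |z|, which is why the deltas shrink as activations saturate.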

The difficulty is not high; the real problem is getting used to MATLAB's matrix operations. As a total beginner, I struggled quite a bit. The core code below is meant to help students who are stuck on the assignment. But please don't just copy it!

```matlab
% One-hot encode the labels: Ty(i,j) = 1 when y(i) == j
Ty = zeros(m, num_labels);
for i = 1:m
    for j = 1:num_labels
        if y(i) == j
            Ty(i,j) = 1;
        end
    end
end

% Forward propagation (add a bias column at each layer)
a1 = X;
a1 = [ones(size(a1,1),1) a1];
z2 = a1 * Theta1';
a2 = sigmoid(z2);
a2 = [ones(size(a2,1),1) a2];
z3 = a2 * Theta2';
a3 = sigmoid(z3);

% Unregularized cross-entropy cost
for i = 1:m
    for j = 1:num_labels
        J = J - log(1 - a3(i,j)) * (1 - Ty(i,j))/m - log(a3(i,j)) * Ty(i,j)/m;
    end
end

% Backpropagation
d3 = a3 - Ty;
d2 = (d3 * Theta2(:,2:end)) .* sigmoidGradient(z2);
Theta1_grad = Theta1_grad + d2' * a1 / m;
Theta2_grad = Theta2_grad + d3' * a2 / m;

% -------------------------------------------------------------
% Regularization term for the cost (bias column j = 1 is skipped)
JJ = 0;
for i = 1:size(Theta1,1)
    for j = 2:size(Theta1,2)
        JJ = JJ + Theta1(i,j) * Theta1(i,j) * lambda/(2*m);
    end
end

for i = 1:size(Theta2,1)
    for j = 2:size(Theta2,2)
        JJ = JJ + Theta2(i,j) * Theta2(i,j) * lambda/(2*m);
    end
end

J = J + JJ;

% Regularization term for the gradients (bias column i = 1 is skipped)
Theta1_gradd = zeros(size(Theta1));
Theta2_gradd = zeros(size(Theta2));

for i = 2:size(Theta1,2)
    for j = 1:size(Theta1,1)
        Theta1_gradd(j,i) = Theta1(j,i) * lambda/m;
    end
end

for i = 2:size(Theta2,2)
    for j = 1:size(Theta2,1)
        Theta2_gradd(j,i) = Theta2(j,i) * lambda/m;
    end
end

Theta1_grad = Theta1_grad + Theta1_gradd;
Theta2_grad = Theta2_grad + Theta2_gradd;
```

PS: This noob blogger promises to force himself to write matrix operations next time instead of nested loops!!!
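For anyone curious what that would look like, here is one way the whole computation could be vectorized, sketched in Python/NumPy rather than MATLAB. This is a sketch under my own assumptions: `nn_cost` and its argument shapes are illustrative choices, not the assignment's actual `nnCostFunction` signature, and `y` is assumed to hold 1-indexed labels as in the course data.

```python
import numpy as np

def nn_cost(Theta1, Theta2, X, y, num_labels, lam):
    """Vectorized regularized NN cost and gradients (illustrative sketch)."""
    m = X.shape[0]

    # One-hot encode the labels in one shot (replaces the double loop).
    # y is assumed to be 1-indexed, as in the course's digit data.
    Ty = np.zeros((m, num_labels))
    Ty[np.arange(m), y - 1] = 1

    # Forward propagation, with a bias column prepended at each layer.
    a1 = np.hstack([np.ones((m, 1)), X])
    z2 = a1 @ Theta1.T
    a2 = np.hstack([np.ones((m, 1)), 1.0 / (1.0 + np.exp(-z2))])
    z3 = a2 @ Theta2.T
    a3 = 1.0 / (1.0 + np.exp(-z3))

    # Cross-entropy cost, summed over all examples and labels at once.
    J = -np.sum(Ty * np.log(a3) + (1 - Ty) * np.log(1 - a3)) / m
    # Regularization skips the bias column of each Theta.
    J += lam / (2 * m) * (np.sum(Theta1[:, 1:] ** 2) + np.sum(Theta2[:, 1:] ** 2))

    # Backpropagation, fully vectorized.
    d3 = a3 - Ty
    g2 = a2[:, 1:] * (1 - a2[:, 1:])        # sigmoid gradient at z2
    d2 = (d3 @ Theta2[:, 1:]) * g2
    Theta1_grad = d2.T @ a1 / m
    Theta2_grad = d3.T @ a2 / m

    # Add the regularization term to every column except the bias.
    Theta1_grad[:, 1:] += lam / m * Theta1[:, 1:]
    Theta2_grad[:, 1:] += lam / m * Theta2[:, 1:]
    return J, Theta1_grad, Theta2_grad
```

A quick sanity check for this kind of code is the gradient check from the assignment: perturb one parameter by a small epsilon and compare the finite difference of J against the corresponding gradient entry.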

