Neural network and deep learning programming exercises (Coursera, Andrew Ng) (3)


A full implementation of a multi-layer neural network that recognizes whether a picture contains a cat.
The original course is on the Coursera homepage; NetEase Cloud Classroom also hosts the course videos, but without the programming exercises.
This program uses the functions completed in the previous assignment to fully implement a multi-layer neural network and trains it to identify whether there is a cat in a picture. The code and the training/test data are available in the download link. There are no comments in the code, but the explanation in the Coursera course is very detailed.
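The script imports its building blocks from dnn_app_utils_v2, written in the previous assignment. As a rough, hypothetical sketch of the contract one of those helpers follows (the real assignment file may differ in details), `linear_activation_forward` takes the previous layer's activations, applies a linear transform, then a nonlinearity, and returns the new activations plus a cache for backpropagation:

```python
import numpy as np

# Hypothetical sketch of a helper the script imports from dnn_app_utils_v2.
# It only illustrates the (A_prev, W, b, activation) -> (A, cache) contract
# the main script relies on; the real course file may differ.

def sigmoid(Z):
    return 1 / (1 + np.exp(-Z))

def relu(Z):
    return np.maximum(0, Z)

def linear_activation_forward(A_prev, W, b, activation):
    """One layer's forward step: linear transform followed by a nonlinearity."""
    Z = W @ A_prev + b                 # linear part: Z = W . A_prev + b
    A = relu(Z) if activation == "relu" else sigmoid(Z)
    cache = (A_prev, W, b, Z)          # saved for the backward pass
    return A, cache

# Tiny smoke test: 3 input features, a layer of 2 units, 4 examples
rng = np.random.default_rng(0)
A0 = rng.standard_normal((3, 4))
W1 = rng.standard_normal((2, 3)) * 0.01
b1 = np.zeros((2, 1))
A1, cache1 = linear_activation_forward(A0, W1, b1, "relu")
print(A1.shape)  # (2, 4)
```

The cache tuple is what makes the later `linear_activation_backward` calls possible without recomputing the forward pass.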

import time
import numpy as np
import h5py
import matplotlib.pyplot as plt
import scipy
from PIL import Image
from scipy import ndimage
from dnn_app_utils_v2 import *

plt.rcParams['figure.figsize'] = (5.0, 4.0)  # set default size of plots
plt.rcParams['image.interpolation'] = 'nearest'
plt.rcParams['image.cmap'] = 'gray'

np.random.seed(1)

train_x_orig, train_y, test_x_orig, test_y, classes = load_dataset()

# Example of a picture
index = 11
plt.imshow(train_x_orig[index])
print("y = " + str(train_y[0, index]) + ", it's a " + classes[train_y[0, index]].decode("utf-8") + " picture.")
plt.show()

m_train = train_x_orig.shape[0]
num_px = train_x_orig.shape[1]
m_test = test_x_orig.shape[0]

print("Number of training examples: " + str(m_train))
print("Number of testing examples: " + str(m_test))
print("Each image is of size: (" + str(num_px) + ", " + str(num_px) + ", 3)")
print("train_x_orig shape: " + str(train_x_orig.shape))
print("train_y shape: " + str(train_y.shape))
print("test_x_orig shape: " + str(test_x_orig.shape))
print("test_y shape: " + str(test_y.shape))

# Reshape the training and test examples
train_x_flatten = train_x_orig.reshape(train_x_orig.shape[0], -1).T
test_x_flatten = test_x_orig.reshape(test_x_orig.shape[0], -1).T

# Standardize data to have feature values between 0 and 1
train_x = train_x_flatten / 255
test_x = test_x_flatten / 255

print("train_x.shape: " + str(train_x.shape))
print("test_x.shape: " + str(test_x.shape))

n_x = train_x.shape[0]
n_h = 7
n_y = 1
layers_dims = (n_x, n_h, n_y)


def two_layer_model(X, Y, layers_dims, learning_rate=0.0075, num_iterations=3000, print_cost=False):
    np.random.seed(1)
    grads = {}
    costs = []
    m = X.shape[1]
    (n_x, n_h, n_y) = layers_dims

    parameters = initialize_parameters(n_x, n_h, n_y)
    W1 = parameters["W1"]
    b1 = parameters["b1"]
    W2 = parameters["W2"]
    b2 = parameters["b2"]

    for i in range(0, num_iterations):
        # Forward propagation
        A1, cache1 = linear_activation_forward(X, W1, b1, "relu")
        A2, cache2 = linear_activation_forward(A1, W2, b2, "sigmoid")

        cost = compute_cost(A2, Y)

        # Backward propagation
        dA2 = -(np.divide(Y, A2) - np.divide(1 - Y, 1 - A2))
        dA1, dW2, db2 = linear_activation_backward(dA2, cache2, "sigmoid")
        dA0, dW1, db1 = linear_activation_backward(dA1, cache1, "relu")

        grads["dW1"] = dW1
        grads["db1"] = db1
        grads["dW2"] = dW2
        grads["db2"] = db2

        parameters = update_parameters(parameters, grads, learning_rate)

        W1 = parameters["W1"]
        b1 = parameters["b1"]
        W2 = parameters["W2"]
        b2 = parameters["b2"]

        if print_cost and i % 100 == 0:
            print("Cost after iteration {}: {}".format(i, np.squeeze(cost)))
        if print_cost and i % 100 == 0:
            costs.append(cost)

    plt.plot(np.squeeze(costs))
    plt.ylabel("cost")
    plt.xlabel("iterations (per tens)")
    plt.title("Learning rate = " + str(learning_rate))
    plt.show()

    return parameters


layers_dims = [12288, 20, 7, 5, 1]


def L_layer_model(X, Y, layers_dims, learning_rate=0.0075, num_iterations=3000, print_cost=False):
    np.random.seed(1)
    costs = []

    parameters = initialize_parameters_deep(layers_dims)

    for i in range(0, num_iterations):
        AL, caches = L_model_forward(X, parameters)
        cost = compute_cost(AL, Y)
        grads = L_model_backward(AL, Y, caches)
        parameters = update_parameters(parameters, grads, learning_rate)

        if print_cost and i % 100 == 0:
            print("Cost after iteration %i: %f" % (i, cost))
        if print_cost and i % 100 == 0:
            costs.append(cost)

    plt.plot(np.squeeze(costs))
    plt.ylabel("cost")
    plt.xlabel("iterations (per tens)")
    plt.title("Learning rate = " + str(learning_rate))
    plt.show()

    return parameters


parameters = L_layer_model(train_x, train_y, layers_dims, num_iterations=2500, print_cost=True)

pred_train = predict(train_x, train_y, parameters)
pred_test = predict(test_x, test_y, parameters)

print_mislabeled_images(classes, test_x, test_y, pred_test)

my_image = "my_image.jpg"   # change this to the name of your image file
my_label_y = [1]            # the true class of your image (1 -> cat, 0 -> non-cat)

fname = "images/" + my_image
image = np.array(ndimage.imread(fname, flatten=False))
my_image = scipy.misc.imresize(image, size=(num_px, num_px)).reshape((num_px * num_px * 3, 1))
my_predicted_image = predict(my_image, my_label_y, parameters)

plt.imshow(image)
print("y = " + str(np.squeeze(my_predicted_image)) + ", your L-layer model predicts a \"" +
      classes[int(np.squeeze(my_predicted_image)),].decode("utf-8") + "\" picture.")
plt.show()


The dnn_app_utils_v2 module and the training/test data are in the download link.
Run result:
As the number of training iterations increases, the cost function decreases.


Pictures from the test set that the model misclassified:

You can also test your own pictures:

Test results:
