Python implements simple neural network algorithms

Source: Internet
Author: User


This article implements simple neural network algorithms in Python for your reference. The details are as follows:

Python implements a 2-layer neural network

Including the input layer and output layer

import numpy as np

# sigmoid function; with deriv=True it returns the derivative,
# assuming x is already a sigmoid output
def nonlin(x, deriv=False):
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))

# input dataset
x = np.array([[0, 0, 1],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]])

# output dataset
y = np.array([[0, 0, 1, 1]]).T

np.random.seed(1)

# initialize weights with mean 0
syn0 = 2 * np.random.random((3, 1)) - 1

for iter in range(100000):
    l0 = x                         # the first layer: the input layer
    l1 = nonlin(np.dot(l0, syn0))  # the second layer: the output layer

    l1_error = y - l1
    l1_delta = l1_error * nonlin(l1, True)

    syn0 += np.dot(l0.T, l1_delta)

print("Output after training:")
print(l1)

Here:

l0: the input layer

l1: the output layer

syn0: the weight matrix (randomly initialized)

l1_error: the prediction error

l1_delta: the error correction term (error scaled by the sigmoid derivative)

nonlin: the sigmoid function (and, with deriv=True, its derivative)
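One detail worth checking is the derivative convention: nonlin(x, deriv=True) computes x*(1-x), so it expects x to already be a sigmoid output, since sigmoid'(z) = s(z)*(1 - s(z)). The small check below (my own addition, not part of the original listing) compares this against a numerical central difference:

```python
import numpy as np

def nonlin(x, deriv=False):
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))

z = 0.5
s = nonlin(z)                      # s = sigmoid(0.5)
analytic = nonlin(s, deriv=True)   # s * (1 - s) = sigmoid'(0.5)
numeric = (nonlin(z + 1e-6) - nonlin(z - 1e-6)) / 2e-6  # central difference
print(analytic, numeric)           # the two values agree closely
```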

It can be seen that the more iterations are run, the closer the prediction gets to the ideal value, at the cost of longer training time.
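A quick sketch (my addition, condensed from the listing above) illustrates this trade-off: training the same 2-layer network from identical starting weights for increasing iteration counts yields a steadily smaller mean absolute error.

```python
import numpy as np

def nonlin(x, deriv=False):
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))

x = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]])
y = np.array([[0, 0, 1, 1]]).T

errors = {}
for iters in (100, 1000, 10000):
    np.random.seed(1)                        # same starting weights each run
    syn0 = 2 * np.random.random((3, 1)) - 1
    for _ in range(iters):
        l1 = nonlin(np.dot(x, syn0))
        syn0 += np.dot(x.T, (y - l1) * nonlin(l1, True))
    errors[iters] = np.mean(np.abs(y - nonlin(np.dot(x, syn0))))
    print(iters, errors[iters])
```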

Python implements a 3-layer neural network

Including the input layer, hidden layer, and output layer

import numpy as np

def nonlin(x, deriv=False):
    if deriv:
        return x * (1 - x)
    else:
        return 1 / (1 + np.exp(-x))

# input dataset
X = np.array([[0, 0, 1],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]])

# output dataset
y = np.array([[0, 1, 1, 0]]).T

syn0 = 2 * np.random.random((3, 4)) - 1  # input-to-hidden layer weights
syn1 = 2 * np.random.random((4, 1)) - 1  # hidden-to-output layer weights

for j in range(60000):
    l0 = X                         # the first layer: the input layer
    l1 = nonlin(np.dot(l0, syn0))  # the second layer: the hidden layer
    l2 = nonlin(np.dot(l1, syn1))  # the third layer: the output layer

    l2_error = y - l2              # the output-layer error
    if (j % 10000) == 0:
        print("Error:" + str(np.mean(np.abs(l2_error))))

    l2_delta = l2_error * nonlin(l2, deriv=True)
    l1_error = l2_delta.dot(syn1.T)  # the hidden-layer error
    l1_delta = l1_error * nonlin(l1, deriv=True)

    syn1 += l1.T.dot(l2_delta)
    syn0 += l0.T.dot(l1_delta)

print("Output after training:")
print(l2)
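Once syn0 and syn1 are trained, prediction is just a forward pass through the two weight matrices. The sketch below (my addition; the predict helper and the fixed seed are assumptions, not part of the original) retrains the network and then wraps inference in a function:

```python
import numpy as np

def nonlin(x, deriv=False):
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))

X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]])
y = np.array([[0, 1, 1, 0]]).T

np.random.seed(1)                        # seeded for reproducibility
syn0 = 2 * np.random.random((3, 4)) - 1
syn1 = 2 * np.random.random((4, 1)) - 1

for _ in range(60000):
    l1 = nonlin(np.dot(X, syn0))
    l2 = nonlin(np.dot(l1, syn1))
    l2_delta = (y - l2) * nonlin(l2, deriv=True)
    l1_delta = l2_delta.dot(syn1.T) * nonlin(l1, deriv=True)
    syn1 += l1.T.dot(l2_delta)
    syn0 += X.T.dot(l1_delta)

def predict(inputs):
    """Forward pass with the trained (frozen) weights."""
    return nonlin(np.dot(nonlin(np.dot(inputs, syn0)), syn1))

print(np.round(predict(X)).ravel())
```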

The above is all the content of this article. I hope it is helpful for your learning.
