Example of an Artificial Neural Network Implemented in Python (Based on the Back Propagation Algorithm)



This example describes an artificial neural network algorithm implemented in Python and is shared for your reference. The details are as follows:

Note: This program is written for Python 3 and requires the numpy package for matrix operations. I have not tested whether it runs under Python 2.

This program implements the back propagation algorithm used in machine learning to train an artificial neural network. For the theoretical background, see my reading notes.
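In essence, each training step moves every weight matrix in the direction that reduces the squared output error: W ← W + η · aᵀδ, where a is the layer input and δ is the error term. Below is a minimal sketch of one such step for a single sigmoid layer; the variable names are illustrative and are not taken from the program further down:

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# one gradient step for a single sigmoid layer: W <- W + eta * a^T . delta
a = np.array([[0.5, -0.2]])           # layer input, shape (1, 2)
W = 2 * np.random.random((2, 1)) - 1  # weights, shape (2, 1)
y = np.array([[1.0]])                 # target output
out = sigmoid(a.dot(W))               # forward pass
delta = out * (1 - out) * (y - out)   # error term: sigmoid'(net) * output error
eta = 0.01                            # learning rate
W = W + eta * a.T.dot(delta)          # weight update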

In this program, the target function maps one input x to two outputs. x is a real number drawn at random from the range [-3.14, 3.14], and the two outputs are y1 = sin(x) and y2 = 1, respectively.
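For reference, a single sample generated this way has the following shapes (a minimal illustration, assuming numpy; the actual sample generation lives in getSinSet in the program below):

import math
import numpy as np

x = 6.2 * np.random.rand(1) - 3.14                  # roughly uniform in [-3.14, 3.14]
x = x.reshape(1, 1)                                 # input, shape (1, 1)
y = np.array([math.sin(x[0, 0]), 1]).reshape(1, 2)  # targets y1 = sin(x), y2 = 1
print(x.shape, y.shape)                             # (1, 1) (1, 2)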

Ten thousand training samples are generated at random. After the network has been trained, the result is verified on five randomly generated test samples.

By adjusting the learning rate, the number of hidden layers, and the size of each hidden layer and then training a new network, we can observe how these parameters influence the learning result, as sketched below.
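For example, the training call can be wrapped in a loop that varies one parameter at a time. This is a hypothetical sketch that reuses names defined in the program below (backPropagation and getSinSet); it is not part of the original listing:

# hypothetical sweep over learning rates; the other parameters match the main block
for etah in (0.001, 0.01, 0.1):
    trainingExamples = [getSinSet() for _ in range(10000)]
    print("training with etah =", etah)
    backPropagation(trainingExamples, etah, input_dim=1, output_dim=2,
                    hidden_dim=100, hidden_num=2)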

The algorithm code is as follows:

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import numpy as np
import math

# definition of the sigmoid function
# numpy.exp works for arrays
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# definition of the sigmoid derivative function
# input must be the sigmoid function's result
def sigmoid_output_to_derivative(result):
    return result * (1 - result)

# init training set
def getTrainingSet(nameOfSet):
    setDict = {"sin": getSinSet()}
    return setDict[nameOfSet]

def getSinSet():
    x = 6.2 * np.random.rand(1) - 3.14
    x = x.reshape(1, 1)
    # y = np.array([5 * x]).reshape(1, 1)
    # y = np.array([math.sin(x)]).reshape(1, 1)
    y = np.array([math.sin(x[0, 0]), 1]).reshape(1, 2)
    return x, y

def getW(synapse, delta):
    resultList = []
    # traverse the weights of each hidden unit on each output; for example,
    # with eight hidden units, each hidden unit has two weights for the two outputs
    for i in range(synapse.shape[0]):
        resultList.append((synapse[i, :] * delta).sum())
    resultArr = np.array(resultList).reshape(1, synapse.shape[0])
    return resultArr

def getT(delta, layer):
    result = np.dot(layer.T, delta)
    return result

def backPropagation(trainingExamples, etah, input_dim, output_dim, hidden_dim, hidden_num):
    # feasibility condition
    if hidden_num < 1:
        print("the number of hidden layers must be at least 1")
        return
    # initialize the network weight matrices; this is the core
    synapseList = []
    # input layer to hidden layer 1
    synapseList.append(2 * np.random.random((input_dim, hidden_dim)) - 1)
    # hidden layer 1 to hidden layer 2, 2 -> 3, ..., n-1 -> n
    for i in range(hidden_num - 1):
        synapseList.append(2 * np.random.random((hidden_dim, hidden_dim)) - 1)
    # hidden layer n to the output layer
    synapseList.append(2 * np.random.random((hidden_dim, output_dim)) - 1)
    iCount = 0
    lastErrorMax = 99999
    # while True:
    for i in range(10000):
        errorMax = 0
        for x, y in trainingExamples:
            iCount += 1
            layerList = []
            # forward propagation
            layerList.append(sigmoid(np.dot(x, synapseList[0])))
            for j in range(hidden_num):
                layerList.append(sigmoid(np.dot(layerList[-1], synapseList[j + 1])))
            # for each output unit k in the network, calculate its error term
            deltaList = []
            layerOutputError = y - layerList[-1]
            # convergence condition
            errorMax = layerOutputError.sum() if layerOutputError.sum() > errorMax else errorMax
            deltaK = sigmoid_output_to_derivative(layerList[-1]) * layerOutputError
            deltaList.append(deltaK)
            iLength = len(synapseList)
            for j in range(hidden_num):
                w = getW(synapseList[iLength - 1 - j], deltaList[j])
                delta = sigmoid_output_to_derivative(layerList[iLength - 2 - j]) * w
                deltaList.append(delta)
            # update each network weight w(ji)
            for j in range(len(synapseList) - 1, 0, -1):
                t = getT(deltaList[iLength - 1 - j], layerList[j - 1])
                synapseList[j] = synapseList[j] + etah * t
            t = getT(deltaList[-1], x)
            synapseList[0] = synapseList[0] + etah * t
        print("maximum output error:")
        print(errorMax)
        if abs(lastErrorMax - errorMax) < 0.0001:
            print("converged")
            print("####################")
            break
        lastErrorMax = errorMax
    # test the trained network
    for i in range(5):
        xTest, yReal = getSinSet()
        layerTmp = sigmoid(np.dot(xTest, synapseList[0]))
        for j in range(1, len(synapseList), 1):
            layerTmp = sigmoid(np.dot(layerTmp, synapseList[j]))
        yTest = layerTmp
        print("x:")
        print(xTest)
        print("actual y:")
        print(yReal)
        print("y output by the neural network:")
        print(yTest)
        print("final output error:")
        print(np.abs(yReal - yTest))
        print("######################")
    print("iterations:")
    print(iCount)

if __name__ == '__main__':
    import datetime
    tStart = datetime.datetime.now()
    # training sample
    nameOfSet = "sin"
    x, y = getTrainingSet(nameOfSet)
    # parameter settings
    # the learning rate is set here
    etah = 0.01
    # number of hidden layers
    hidden_num = 2
    # size of the network input layer
    input_dim = x.shape[1]
    # size of the hidden layers
    hidden_dim = 100
    # size of the output layer
    output_dim = y.shape[1]
    # build training examples
    trainingExamples = []
    for i in range(10000):
        x, y = getTrainingSet(nameOfSet)
        trainingExamples.append((x, y))
    # start training the network with the back propagation algorithm
    backPropagation(trainingExamples, etah, input_dim, output_dim, hidden_dim, hidden_num)
    tEnd = datetime.datetime.now()
    print("time cost:")
    print(tEnd - tStart)
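A standard way to check that the error terms above are computed correctly is to compare the analytic gradient with a finite-difference estimate on a tiny network. The following is an independent sketch using the same squared-error and sigmoid conventions as the program; it is not part of the original listing:

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def loss(W, x, y):
    # squared error of a one-layer sigmoid network
    out = sigmoid(x.dot(W))
    return 0.5 * ((y - out) ** 2).sum()

x = np.array([[0.3, -0.7]])
y = np.array([[0.8]])
W = 2 * np.random.random((2, 1)) - 1

# analytic gradient: dL/dW = -x^T . ((y - out) * out * (1 - out))
out = sigmoid(x.dot(W))
grad = -x.T.dot((y - out) * out * (1 - out))

# one-sided finite-difference estimate for the first weight
eps = 1e-5
W2 = W.copy()
W2[0, 0] += eps
numeric = (loss(W2, x, y) - loss(W, x, y)) / eps
print(grad[0, 0], numeric)  # the two values should agree to several decimal places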
