Implementing a BP Neural Network in Python with numpy

This article uses numpy to implement a simple BP (backpropagation) neural network. Because the network is used for regression rather than classification, the activation function chosen for the output layer is the identity, f(x) = x. The underlying theory of BP neural networks is not covered here.
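For reference, the sigmoid used in the hidden layer has the convenient property that its derivative can be computed from its own output, sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)); this is the hidden_outputs * (1 - hidden_outputs) term that appears in the backward pass below. A quick standalone numerical check (this snippet is illustrative and not part of the original code):

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = 0.7
s = sigmoid(x)
analytic = s * (1.0 - s)                                   # derivative via the identity
numeric = (sigmoid(x + 1e-6) - sigmoid(x - 1e-6)) / 2e-6   # central difference
print(analytic, numeric)  # both are approximately 0.2217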

import numpy as np

class NeuralNetwork(object):
    def __init__(self, input_nodes, hidden_nodes, output_nodes, learning_rate):
        # Set the number of nodes in the input, hidden, and output layers.
        self.input_nodes = input_nodes
        self.hidden_nodes = hidden_nodes
        self.output_nodes = output_nodes

        # Initialize the weights and the learning rate.
        self.weights_input_to_hidden = np.random.normal(
            0.0, self.hidden_nodes ** -0.5,
            (self.hidden_nodes, self.input_nodes))
        self.weights_hidden_to_output = np.random.normal(
            0.0, self.output_nodes ** -0.5,
            (self.output_nodes, self.hidden_nodes))
        self.lr = learning_rate

        # The activation function of the hidden layer is the sigmoid function.
        self.activation_function = lambda x: 1.0 / (1.0 + np.exp(-x))

    def train(self, inputs_list, targets_list):
        # Convert the input lists to 2d column vectors.
        inputs = np.array(inputs_list, ndmin=2).T   # shape [feature_dimension, 1]
        targets = np.array(targets_list, ndmin=2).T

        ### Forward pass ###
        # Hidden layer
        hidden_inputs = np.dot(self.weights_input_to_hidden, inputs)  # signals into hidden layer
        hidden_outputs = self.activation_function(hidden_inputs)      # signals from hidden layer

        # The activation function of the output layer is y = x.
        final_inputs = np.dot(self.weights_hidden_to_output, hidden_outputs)  # signals into final output layer
        final_outputs = final_inputs                                          # signals from final output layer

        ### Backward pass: use gradient descent to update the weights ###
        # The output layer error is the difference between the desired target
        # and the actual output.
        output_errors = targets - final_outputs

        # Error propagated back to the hidden layer, scaled by the derivative
        # of the sigmoid: sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)).
        hidden_errors = np.dot(self.weights_hidden_to_output.T, output_errors) \
            * hidden_outputs * (1.0 - hidden_outputs)

        # Update the hidden-to-output weights with a gradient descent step.
        self.weights_hidden_to_output += self.lr * np.dot(output_errors, hidden_outputs.T)
        # Update the input-to-hidden weights with a gradient descent step.
        self.weights_input_to_hidden += self.lr * np.dot(hidden_errors, inputs.T)

    def run(self, inputs_list):
        # Run a forward pass through the network for prediction.
        inputs = np.array(inputs_list, ndmin=2).T

        # Hidden layer
        hidden_inputs = np.dot(self.weights_input_to_hidden, inputs)  # signals into hidden layer
        hidden_outputs = self.activation_function(hidden_inputs)      # signals from hidden layer

        # Output layer
        final_inputs = np.dot(self.weights_hidden_to_output, hidden_outputs)  # signals into final output layer
        final_outputs = final_inputs                                          # signals from final output layer

        return final_outputs
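To see the class in action, here is a minimal usage sketch that fits a noisy sine curve. The hyperparameters (8 hidden nodes, learning rate 0.1, 2000 epochs) and the toy data are assumptions for illustration, not values from the original article:

import numpy as np

np.random.seed(42)

# Toy regression data: y = sin(x) plus a little noise (assumed for this example).
X = np.linspace(0.0, 3.0, 100)
y = np.sin(X) + np.random.normal(0.0, 0.05, size=X.shape)

# One input feature, a small hidden layer, one output.
network = NeuralNetwork(input_nodes=1, hidden_nodes=8,
                        output_nodes=1, learning_rate=0.1)

# Train one sample at a time (stochastic gradient descent).
for epoch in range(2000):
    for xi, yi in zip(X, y):
        network.train([xi], [yi])

# Predict on a few points.
for xi in (0.5, 1.5, 2.5):
    print(xi, network.run([xi]).item())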

The above is the full content of this article. I hope it is helpful for your learning.
