Perceptron Learning Algorithm - Neural Networks


Reposted from http://blog.csdn.net/stan1989/article/details/8565499

Machine Learning: Perceptron Learning Algorithm Introduction

Here we begin to introduce neural networks. We will first cover a few supervised learning algorithms, followed later by unsupervised learning. To start, here is a basic introduction to the perceptron learning algorithm.
1. Neural Networks

Like evolutionary computing, neural networks are a biologically inspired concept. A neural network consists of one or more neurons. A neuron has inputs, an output, and an "internal processor": it receives information from its inputs, processes that information with the internal processor, and finally emits the result through its output.

2. Perceptron

The perceptron is a concept from neural networks, first introduced by Frank Rosenblatt in the 1950s.

3. Single-Layer Perceptron

The single-layer perceptron is the simplest neural network. It contains only an input layer and an output layer, and the two are directly connected.


Figure 1.1

Figure 1.1 shows a single-layer perceptron: a very simple structure in which the input layer is directly connected to the output layer.

Next, let's look at how to calculate the output.


Equation 1 computes the output from the input layer, and it is easy to understand: each input is multiplied by its weight and the products are summed, sum = w1*x1 + w2*x2 + ... + wn*xn. The sum is then thresholded: if it is greater than the critical value (typically 0), the output is 1; if it is less, the output is -1. For example, with inputs (1, 0.5) and weights (0.4, -0.3), the sum is 0.4*1 + (-0.3)*0.5 = 0.25 > 0, so the output is 1.

The code for a single-layer perceptron is given below.

// Single-Layer Perceptron (SLP)
bool slp_calculate_output(const double *inputs, const double *weights, int nInputs, int &output)
{
    if (NULL == inputs || NULL == weights)
        return false;

    double sum = 0.0;
    for (int i = 0; i < nInputs; ++i)
    {
        sum += (weights[i] * inputs[i]);
    }

    // Our handling of the weighted sum: if it is greater than 0, the output is 1;
    // in all other cases, the output is -1.
    if (sum > 0.0)
        output = 1;
    else
        output = -1;

    return true;
}

Thanks to its simple structure, a single-layer perceptron can be computed very quickly. It can realize the simple logical operations NOT, OR, and AND, as sketched just below.
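As a quick check of that claim, here is a small usage sketch (not from the original post). It reuses slp_calculate_output from the listing above, encodes true as 1 and false as -1, and treats the bias as an extra input fixed to 1; the particular weight values are one assumed choice that happens to work, not the only possibility.

#include <cstdio>

// Reuses slp_calculate_output(...) from the listing above.

int main()
{
    // The third weight acts on the bias input, which is fixed to 1.
    const double and_weights[3] = { 1.0, 1.0, -1.0 }; // logical AND
    const double or_weights[3]  = { 1.0, 1.0,  1.0 }; // logical OR

    const double patterns[4][2] = { {-1, -1}, {-1, 1}, {1, -1}, {1, 1} };

    for (int p = 0; p < 4; ++p)
    {
        double inputs[3] = { patterns[p][0], patterns[p][1], 1.0 }; // last slot is the bias
        int and_out = 0, or_out = 0;
        slp_calculate_output(inputs, and_weights, 3, and_out);
        slp_calculate_output(inputs, or_weights, 3, or_out);
        printf("x1=%2.0f x2=%2.0f  AND=%2d  OR=%2d\n",
               patterns[p][0], patterns[p][1], and_out, or_out);
    }
    return 0;
}

Running it prints the AND and OR truth tables; no single set of weights can reproduce XOR this way, which is exactly the limitation discussed next.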

But it is powerless for even slightly more complex operations such as XOR. This problem can be solved by the multilayer perceptron described below.

4. Multilayer Perceptron

A multilayer perceptron (MLP) performs its computation in multiple layers.

Compared with the single-layer perceptron, the output changes from a single value to several, and between the input and the output there is no longer a single layer but two: a hidden layer and an output layer.


Figure 2.2

Figure 2.2 is a multilayer perceptron.

The computation of a multilayer perceptron is also easy to understand: first compute each hidden unit from the inputs with equation 1, then compute each output unit from the hidden units in the same way.

Take a look at the code and you'll see how it works.

// Multi-Layer Perceptron (MLP)
const unsigned int nInputs  = 4;
const unsigned int nOutputs = 3;
const unsigned int nHiddens = 4;

struct MLP
{
    double inputs[nInputs + 1];    // one extra slot stores the bias, generally set to 1
    double outputs[nOutputs];
    double hiddens[nHiddens + 1];  // one extra slot stores the bias, generally set to 1
    double weight_hiddens_2_inputs[nHiddens + 1][nInputs + 1];
    double weight_outputs_2_hiddens[nOutputs][nHiddens + 1];
};

// Our handling of the weighted sum: if it is greater than 0, the output is 1.0;
// in all other cases, the output is -1.0.
double sigmoid(double val)
{
    if (val > 0.0)
        return 1.0;
    else
        return -1.0;
}

// Compute the outputs. (The body was truncated in the original post; the rest is
// reconstructed from the description above: every unit is computed with equation 1.)
bool mlp_calculate_outputs(MLP *pMlp)
{
    if (NULL == pMlp)
        return false;

    // Hidden layer: weighted sum of the inputs, then threshold.
    for (unsigned int h = 0; h < nHiddens; ++h)
    {
        double sum = 0.0;
        for (unsigned int i = 0; i < nInputs + 1; ++i)
            sum += pMlp->weight_hiddens_2_inputs[h][i] * pMlp->inputs[i];
        pMlp->hiddens[h] = sigmoid(sum);
    }
    pMlp->hiddens[nHiddens] = 1.0; // bias unit of the hidden layer

    // Output layer: weighted sum of the hidden units, then threshold.
    for (unsigned int o = 0; o < nOutputs; ++o)
    {
        double sum = 0.0;
        for (unsigned int h = 0; h < nHiddens + 1; ++h)
            sum += pMlp->weight_outputs_2_hiddens[o][h] * pMlp->hiddens[h];
        pMlp->outputs[o] = sigmoid(sum);
    }
    return true;
}
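For completeness, here is a minimal usage sketch (again not from the original post). It assumes the MLP struct and mlp_calculate_outputs above, and fills the weights with arbitrary placeholder values just to make the call runnable; in practice the weights would come from training.

#include <cstdio>
#include <cstring>

// Assumes MLP, nInputs, nOutputs, nHiddens and mlp_calculate_outputs from above.

int main()
{
    MLP mlp;
    memset(&mlp, 0, sizeof(mlp));

    // Fill the inputs; the last slot is the bias, set to 1.
    double sample[nInputs] = { 0.5, -1.0, 0.25, 1.0 };
    for (unsigned int i = 0; i < nInputs; ++i)
        mlp.inputs[i] = sample[i];
    mlp.inputs[nInputs] = 1.0;

    // Placeholder weights, chosen arbitrarily for this sketch.
    for (unsigned int h = 0; h < nHiddens + 1; ++h)
        for (unsigned int i = 0; i < nInputs + 1; ++i)
            mlp.weight_hiddens_2_inputs[h][i] = 0.1;
    for (unsigned int o = 0; o < nOutputs; ++o)
        for (unsigned int h = 0; h < nHiddens + 1; ++h)
            mlp.weight_outputs_2_hiddens[o][h] = -0.2;

    if (mlp_calculate_outputs(&mlp))
    {
        for (unsigned int o = 0; o < nOutputs; ++o)
            printf("output[%u] = %f\n", o, mlp.outputs[o]);
    }
    return 0;
}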
