Introduction to Artificial Neural Networks and a Single-Layer Network Implementation of the AND Operation -- AForge.NET Framework Use (V)
The previous four articles covered fuzzy systems, which differ from traditional two-valued logic and are grounded in fuzzy mathematics, so some readers found them a bit confusing. If you are interested, I suggest consulting related books; I recommend the "Fuzzy Mathematics Tutorial" from National Defense Industry Press, which is very thorough and very cheap (I bought it for 7 yuan).
Introduction to Artificial Neural Networks
An artificial neural network (ANN) is a mathematical model for information processing whose structure mimics the synaptic connections of the brain. It is an operational model consisting of a large number of neurons and the connections between them. Each neuron represents a specific output function, called the activation (excitation) function. Each connection between two nodes carries a weighted value, called the weight, which scales the signal passing through it and serves to simulate memory. The output of the whole network depends on the connection topology, the weights, and the activation functions. The network itself is usually an approximation of some algorithm or function found in nature, or the expression of a logical strategy.
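To make the description above concrete, here is a minimal sketch (in Python, for illustration only; the weights, bias, and step function are arbitrary assumed values, not taken from any network in this article) of how a single neuron computes its output from weighted inputs and an activation function:

```python
# A single artificial neuron: weighted sum of inputs, passed through
# an activation function. Weights and bias here are illustrative.

def neuron_output(inputs, weights, bias, activation):
    # weighted sum of the input signals, plus a bias term
    s = sum(x * w for x, w in zip(inputs, weights)) + bias
    return activation(s)

# a simple step (threshold) activation: fire if the sum is non-negative
step = lambda s: 1 if s >= 0 else 0

print(neuron_output([1, 1], [0.5, 0.5], -0.7, step))  # 1 (0.5+0.5-0.7 >= 0)
print(neuron_output([1, 0], [0.5, 0.5], -0.7, step))  # 0 (0.5-0.7 < 0)
```

With these particular weights the single neuron already behaves like an AND gate, which previews the implementation below.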
The advantages of artificial neural network are obvious, mainly embodied in the following three aspects:
1. A self-learning capability
2. Associative memory
3. The ability to find optimized solutions at high speed
For more information, please refer to the relevant literature.
AForge.NET Single-Layer Network Implementation of the AND Operation
The neural network implementation in AForge.NET lives mainly in AForge.Neuro, which can be obtained via Install-Package AForge.Neuro (NuGet).
We follow the general steps:
1. Build the Model
The AND operation needs no further explanation; organize its inputs and outputs:
[0,0] ===> [0]
[1,0] ===> [0]
[0,1] ===> [0]
[1,1] ===> [1]
It is easy to see that there are 2 input nodes and 1 output node, and a single layer is sufficient.
Code:
// organize input and output data
double[][] input = new double[4][];
double[][] output = new double[4][];
input[0] = new double[] { 0, 0 }; output[0] = new double[] { 0 };
input[1] = new double[] { 0, 1 }; output[1] = new double[] { 0 };
input[2] = new double[] { 1, 0 }; output[2] = new double[] { 0 };
input[3] = new double[] { 1, 1 }; output[3] = new double[] { 1 };
2. Selecting the activation function and learning rule
The activation (excitation) function in AForge.NET must implement the IActivationFunction interface; the library provides three implementations:
BipolarSigmoidFunction
SigmoidFunction
ThresholdFunction
We use the threshold function as our activation function.
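For intuition, here is an illustrative sketch (in Python, not AForge code) of what the three activation functions compute, using their commonly cited formulas; the steepness parameter alpha and its default value of 2 are assumptions based on how these functions are usually defined, so check the AForge.NET documentation for the exact signatures:

```python
import math

# Commonly cited forms of the three activation functions:
# threshold: 1 if the weighted sum is non-negative, else 0
# sigmoid: squashes input into (0, 1)
# bipolar sigmoid: squashes input into (-1, 1)

def threshold(x):
    return 1.0 if x >= 0 else 0.0

def sigmoid(x, alpha=2.0):
    return 1.0 / (1.0 + math.exp(-alpha * x))

def bipolar_sigmoid(x, alpha=2.0):
    return 2.0 / (1.0 + math.exp(-alpha * x)) - 1.0

print(threshold(0.3))                   # 1.0
print(round(sigmoid(0.0), 2))           # 0.5 (midpoint of its range)
print(round(bipolar_sigmoid(0.0), 2))   # 0.0 (midpoint of its range)
```

The threshold function is the natural choice here because the AND operation has a strictly binary 0/1 output.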
Then consider the learning function.
An AForge.NET learning algorithm must implement the ISupervisedLearning or IUnsupervisedLearning interface; the library implements five of them.
Among them, perceptron learning can be said to be the first neural network learning algorithm; it appeared in 1957 and is often used for classifying linearly separable data.
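The perceptron rule nudges each weight by learning_rate * error * input whenever the prediction differs from the target. Here is a minimal sketch of that classic rule applied to the AND data above, written in plain Python for illustration (it mirrors the idea behind AForge's PerceptronLearning, but is not AForge code):

```python
# Classic perceptron learning rule on the AND truth table.
samples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w = [0.0, 0.0]   # weights
b = 0.0          # bias
rate = 0.1       # learning rate

for _ in range(20):  # a few epochs suffice for linearly separable data
    for x, target in samples:
        # threshold activation on the weighted sum
        out = 1 if (w[0] * x[0] + w[1] * x[1] + b) >= 0 else 0
        err = target - out  # -1, 0, or +1
        # nudge weights and bias in the direction that reduces the error
        w = [wi + rate * err * xi for wi, xi in zip(w, x)]
        b += rate * err

print([1 if (w[0] * x[0] + w[1] * x[1] + b) >= 0 else 0
       for x, _ in samples])  # [0, 0, 0, 1]
```

Because AND is linearly separable, the loop converges after a handful of epochs; for data that is not linearly separable (e.g. XOR), a single-layer perceptron cannot converge.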
Code:
// set up the network: threshold activation function, 2 inputs, 1 output
ActivationNetwork network = new ActivationNetwork(new ThresholdFunction(), 2, 1);
// learning method: perceptron learning algorithm
PerceptronLearning teacher = new PerceptronLearning(network);
3. Training Network
teacher.RunEpoch(input, output);
4. Get output for processing
Since this simply simulates the original operation, no post-processing is needed; we just run the simulation and check the result.
// simulation
for (int i = 0; i < 4; i++)
{
    Console.WriteLine("input{0}: ===> {1},{2} sim{0}: ===> {3}", i, input[i][0], input[i][1], network.Compute(input[i])[0]);
}
Full code:
// organize input and output data
double[][] input = new double[4][];
double[][] output = new double[4][];
input[0] = new double[] { 0, 0 }; output[0] = new double[] { 0 };
input[1] = new double[] { 0, 1 }; output[1] = new double[] { 0 };
input[2] = new double[] { 1, 0 }; output[2] = new double[] { 0 };
input[3] = new double[] { 1, 1 }; output[3] = new double[] { 1 };
for (int i = 0; i < 4; i++)
{
    Console.WriteLine("input{0}: ===> {1},{2} output{0}: ===> {3}", i, input[i][0], input[i][1], output[i][0]);
}
// set up the network: 1 layer, 2 inputs, 1 output, threshold activation function
ActivationNetwork network = new ActivationNetwork(new ThresholdFunction(), 2, 1);
// learning method: perceptron learning algorithm
PerceptronLearning teacher = new PerceptronLearning(network);
// define the absolute error
double error = 1.0;
Console.WriteLine();
Console.WriteLine("learning error ===> {0}", error);
// output the learning rate
Console.WriteLine();
Console.WriteLine("learning rate ===> {0}", teacher.LearningRate);
// number of iterations
int iterations = 0;
Console.WriteLine();
while (error > 0.001)
{
    error = teacher.RunEpoch(input, output);
    Console.WriteLine("learning error ===> {0}", error);
    iterations++;
}
Console.WriteLine("iterations ===> {0}", iterations);
Console.WriteLine();
Console.WriteLine("SIM:");
// simulation
for (int i = 0; i < 4; i++)
{
    Console.WriteLine("input{0}: ===> {1},{2} sim{0}: ===> {3}", i, input[i][0], input[i][1], network.Compute(input[i])[0]);
}
Effect: (console output screenshot omitted)
The Prospect of Combining Artificial Neural Networks and Fuzzy Systems
Let me explain why artificial neural networks are discussed right after fuzzy logic. Although fuzzy logic and neural networks are two distinct fields whose theoretical foundations are far apart (one offers a new computational model, the other a new set theory), from the perspective of integrating practice and theory it is entirely possible to bring them together. The combination of fuzzy logic and neural networks creates a new technical field: fuzzy neural networks.
The common forms are:
1. Logic fuzzy neural networks
2. Arithmetic fuzzy neural networks
3. Hybrid logic neural networks
I personally feel that combining the two essentially comes down to studying and optimizing the weight coefficients.
Logic fuzzy neural networks generally use error-based learning algorithms such as the fuzzy BP algorithm; arithmetic fuzzy neural systems generally use genetic algorithms. The techniques in these two areas are relatively mature. For hybrid logic neural networks there is no specific learning algorithm; they are used for computation rather than learning.
Some readers may think the first two are relatively new; I thought so at first, but after searching nearly a decade of related papers (on Wanfang), I found that most of the ideas and methods already appear in early issues of journals such as Cybernetics (around 1980).
Finally, I attach 3 articles about perceptron learning: http://www.ctdisk.com/file/4525564