I. Activation functions:
1. Hard-limit function: used for classification.
2. Linear function: used for function approximation.
3. Saturating linear function.
4. Sigmoid function: an S-shaped, continuous function, so the weights can be adjusted by the BP algorithm.
5. Gaussian function.

II. Learning rules:
1. Hebb rule
2. Discrete perceptron learning rule
3. Delta (δ) learning rule
4. Widrow-Hoff learning rule

III. Topological structures of neural networks:
1. Feedforward neural networks: the connection graph contains no loops, i.e., no feedback. Apart from the input layer, the neurons of the hidden and output layers all perform computation and are therefore called compute nodes.
   - Two-layer perceptron network.
   - Multilayer perceptron network: if all the compute nodes use the hard-limit function, the network is a multilayer discrete perceptron; if the compute nodes use the sigmoid function, it is the so-called BP network, and the weights and thresholds can then be learned with the error back-propagation (BP) algorithm. The activation function of the output nodes of a BP network depends on the application: for classification, the output-layer nodes generally use the sigmoid or hard-limit function; for function approximation, the output nodes should use a linear function.
   - Radial basis function (RBF) neural network: an input layer, a hidden layer, and an output layer. The hidden-layer basis functions take a distance (between the input and a center vector) as their argument, and the activation function is the Gaussian function.
2. Feedback neural networks: the connection graph contains loops. The best-known example is the Hopfield network.
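The activation functions listed above can be sketched in plain Python. These are the standard textbook definitions; the function names are illustrative, not taken from any library:

```python
import math

def hard_limit(x):
    """Step function, used for classification: 1 if x >= 0, else 0."""
    return 1.0 if x >= 0.0 else 0.0

def linear(x):
    """Identity, used for function approximation."""
    return x

def saturating_linear(x):
    """Linear on [-1, 1], clipped to -1 or +1 outside that range."""
    return max(-1.0, min(1.0, x))

def sigmoid(x):
    """Continuous S-shaped function, so BP can differentiate it."""
    return 1.0 / (1.0 + math.exp(-x))

def gaussian(x, sigma=1.0):
    """Bell curve centered at 0, as used in RBF hidden layers."""
    return math.exp(-x * x / (2.0 * sigma * sigma))
```

The key contrast is differentiability: the hard-limit function has no useful derivative, which is why BP training requires continuous activations such as the sigmoid.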
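The discrete perceptron learning rule can be illustrated with a small sketch: a single hard-limit unit trained on the linearly separable AND function. All names and the learning rate are illustrative choices:

```python
def train_perceptron(samples, lr=0.1, epochs=50):
    """samples: list of (inputs, target) pairs with target in {0, 1}."""
    n = len(samples[0][0])
    w = [0.0] * n          # weights, initialized to zero
    b = 0.0                # bias (negative threshold)
    for _ in range(epochs):
        for x, t in samples:
            s = sum(wi * xi for wi, xi in zip(w, x)) + b
            y = 1 if s >= 0 else 0            # hard-limit activation
            err = t - y
            # perceptron rule: w <- w + lr * (t - y) * x
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Usage: learn logical AND, which is linearly separable
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
```

Because the data are linearly separable, the perceptron convergence theorem guarantees the updates stop after finitely many corrections.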
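The Widrow-Hoff (LMS) rule is the same idea applied to a linear unit: the error is measured before the threshold, on the raw linear output. A minimal sketch, fitting a single linear neuron to samples of y = 2x + 1 (the target function, learning rate, and epoch count are illustrative):

```python
def train_lms(samples, lr=0.05, epochs=200):
    """Fit one linear unit y = w*x + b by the Widrow-Hoff (LMS) rule."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, t in samples:
            y = w * x + b            # linear activation, no threshold
            err = t - y
            w += lr * err * x        # w <- w + lr * (t - y) * x
            b += lr * err
    return w, b

# Usage: recover the line y = 2x + 1 from five exact samples
data = [(x, 2.0 * x + 1.0) for x in (-2.0, -1.0, 0.0, 1.0, 2.0)]
w, b = train_lms(data)
```

This also motivates the rule of thumb above for BP output nodes: a linear output node with LMS-style updates is the natural fit for function approximation.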
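The BP algorithm mentioned above can be sketched for a tiny 2-2-1 network with sigmoid compute nodes, trained on XOR. This is an illustrative toy (network size, learning rate, seed, and epoch count are all assumptions), not a production implementation:

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# weights: 2 hidden neurons x (2 inputs + bias), 1 output x (2 + bias)
wh = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
wo = [random.uniform(-1, 1) for _ in range(3)]

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in wh]
    y = sigmoid(wo[0] * h[0] + wo[1] * h[1] + wo[2])
    return h, y

def train_step(x, t, lr=0.5):
    h, y = forward(x)
    # output delta: (t - y) * y * (1 - y), using the sigmoid derivative
    do = (t - y) * y * (1.0 - y)
    # hidden deltas: propagate do backward through the (old) output weights
    dh = [do * wo[i] * h[i] * (1.0 - h[i]) for i in range(2)]
    for i in range(2):
        wo[i] += lr * do * h[i]
    wo[2] += lr * do
    for i in range(2):
        for j in range(2):
            wh[i][j] += lr * dh[i] * x[j]
        wh[i][2] += lr * dh[i]

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
before = sum((t - forward(x)[1]) ** 2 for x, t in data)
for _ in range(2000):
    for x, t in data:
        train_step(x, t)
after = sum((t - forward(x)[1]) ** 2 for x, t in data)
```

The point of the sketch is the two-phase structure: a forward pass through the compute nodes, then error deltas propagated backward to adjust weights and thresholds layer by layer.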
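The Hopfield network ties two of the lists above together: its weights are set by the Hebb rule, and recall runs the feedback loop until the state settles. A minimal sketch with bipolar {-1, +1} states and synchronous updates (all names and the 6-bit pattern are illustrative):

```python
def hebb_weights(patterns):
    """Hebb rule: w[i][j] = average of p[i]*p[j] over patterns, zero diagonal."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def sgn(x):
    return 1 if x >= 0 else -1

def recall(w, state, steps=5):
    """Iterate the feedback loop: each neuron takes the sign of its field."""
    n = len(state)
    s = list(state)
    for _ in range(steps):
        s = [sgn(sum(w[i][j] * s[j] for j in range(n))) for i in range(n)]
    return s

# Usage: store one pattern, then recover it from a corrupted copy
pattern = [1, -1, 1, -1, 1, -1]
w = hebb_weights([pattern])
noisy = [-1, -1, 1, -1, 1, -1]   # first bit flipped
restored = recall(w, noisy)
```

With a single stored pattern, one update step already flips the corrupted bit back, illustrating the network's use as an associative memory.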
Some basic concepts