BP network structure
For multilayer feedforward neural networks, the error back-propagation algorithm (the BP algorithm), proposed by Rumelhart and McClelland in 1985, is widely used. The network structure is shown in the figure:
It is actually a multilayer perceptron consisting of input nodes, output nodes, and one or more layers of hidden nodes. There are no connections between nodes within the same layer, while adjacent layers are fully connected to each other.
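As an illustration of this layered, fully connected structure, here is a minimal sketch in Python with NumPy; the layer sizes (3 input nodes, one hidden layer of 4 nodes, 2 output nodes) are arbitrary assumptions for the example, not values from the article.

```python
import numpy as np

# Assumed example sizes: 3 input nodes, 4 hidden nodes, 2 output nodes.
rng = np.random.default_rng(seed=0)

# Full connections exist only between adjacent layers; no links within a layer.
W1 = rng.normal(size=(4, 3))  # weights from input layer to hidden layer
b1 = np.zeros(4)              # hidden-layer biases
W2 = rng.normal(size=(2, 4))  # weights from hidden layer to output layer
b2 = np.zeros(2)              # output-layer biases
```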
Activation function
The input signal propagates forward to the hidden nodes, passes through the activation function, and the result is finally produced by the output nodes. The activation function is required to be continuous, differentiable, and non-decreasing, as shown in the following figure:
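The figure itself is not reproduced here, but the sigmoid function is a classic choice that satisfies these requirements (continuous, differentiable, non-decreasing); a minimal sketch:

```python
import numpy as np

def sigmoid(x):
    """Sigmoid activation: continuous, differentiable, and non-decreasing."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    """Derivative of the sigmoid, which the BP weight-update rule relies on."""
    s = sigmoid(x)
    return s * (1.0 - s)
```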
The BP network implements an approximate mapping from an m-dimensional input space to an n-dimensional output space. Because f(x) is a nonlinear function, the network has a certain degree of fault tolerance, which makes the BP network more flexible than a network of simple linear threshold units.
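Continuing the sketches above, the forward pass can be read as exactly this mapping from an m-dimensional input vector to an n-dimensional output vector (here m = 3 and n = 2, matching the assumed example sizes):

```python
def forward(x, W1, b1, W2, b2):
    """Map an m-dimensional input vector to an n-dimensional output vector."""
    hidden = sigmoid(W1 @ x + b1)       # hidden-layer activations
    output = sigmoid(W2 @ hidden + b2)  # output-layer activations
    return output

x = np.array([0.5, -1.0, 2.0])   # an m = 3 dimensional input
y = forward(x, W1, b1, W2, b2)   # an n = 2 dimensional output
```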