For multilayer feedforward neural networks, the error back-propagation algorithm (the BP algorithm), proposed by Rumelhart and McClelland in 1985, is the most widely used. The network structure is shown in the figure:
It is essentially a multilayer perceptron consisting of input nodes, output nodes, and one or more layers of hidden nodes. There are no connections between nodes within the same layer, while adjacent layers are fully connected to each other.
Activation function
The input signal is propagated forward to the hidden-layer nodes, transformed by the activation function, and the result is finally produced by the output nodes. The activation function is required to be continuously differentiable and non-decreasing, as shown in the following figure:
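As a minimal sketch, assuming the commonly used sigmoid as the activation function (the source does not name a specific one), the following Python snippet shows such a continuously differentiable, non-decreasing function and its derivative, which is what back-propagation needs:

```python
import numpy as np

def sigmoid(x):
    """Sigmoid activation: continuously differentiable and non-decreasing."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    """Derivative of the sigmoid, used when back-propagating the error."""
    s = sigmoid(x)
    return s * (1.0 - s)
```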
A BP network realizes an approximate mapping from an m-dimensional input space to an n-dimensional output space. Because f(x) is a nonlinear function, the network has a certain degree of fault tolerance, which makes the BP network more flexible than a network of simple linear threshold units.
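To illustrate this m-to-n mapping, here is a hedged sketch of a forward pass through a network with a single hidden layer; the dimensions, random weights, and the sigmoid choice are illustrative assumptions, not part of the source:

```python
import numpy as np

def sigmoid(x):
    # Nonlinear, continuously differentiable, non-decreasing activation
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

m, hidden, n = 3, 4, 2               # input, hidden, and output sizes (illustrative)
W1 = rng.normal(size=(hidden, m))    # input-to-hidden weights
b1 = np.zeros(hidden)
W2 = rng.normal(size=(n, hidden))    # hidden-to-output weights
b2 = np.zeros(n)

def forward(x):
    """Forward propagation: maps an m-dimensional vector to an n-dimensional one."""
    h = sigmoid(W1 @ x + b1)         # hidden-layer activations
    return sigmoid(W2 @ h + b2)      # output-layer activations

x = rng.normal(size=m)
print(forward(x))                    # n-dimensional output vector
```

Training would then adjust W1, W2, b1, b2 by propagating the output error backward through these same layers.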