Lecture 14: Radial Basis Function Network
14.1 RBF Network Hypothesis
Figure 14-1 RBF Network
As Figure 14-1 shows, the only thing special about the RBF network is that it uses the RBF kernel as its activation function. So why would we want an RBF network? Is it universally agreed that the RBF kernel is especially good? If any mathematical function could serve as an activation, wouldn't there be infinitely many kinds of networks?
What, then, distinguishes an RBF network from a BP (backpropagation) neural network?
First, the RBF network is computationally fast, because it really has only three layers: the right half of Figure 14-1 shows the actual structure, while the left half is only schematic (a generic network need not have exactly three layers).
Second, biology offers a supporting conclusion: neurons in the brain work this way. When you smell a flower, the neurons that respond to spiciness are not stimulated. The RBF is a local activation function: the closer an input is to a memorized point, the more strongly it is activated.
Figure 14-2
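The hypothesis above can be sketched as a weighted vote over RBF units, h(x) = Σ_m β_m · RBF(x, μ_m). A minimal sketch, assuming a Gaussian RBF; the width parameter gamma and the toy centers are illustrative choices, not values from the lecture:

```python
import numpy as np

def gaussian_rbf(x, mu, gamma=1.0):
    """Gaussian RBF: local activation that decays with distance from center mu."""
    return np.exp(-gamma * np.sum((x - mu) ** 2))

def rbf_network(x, centers, betas, gamma=1.0):
    """h(x) = sum_m beta_m * RBF(x, mu_m): a weighted vote over M RBF units."""
    return sum(b * gaussian_rbf(x, mu, gamma) for b, mu in zip(betas, centers))

# Two hypothetical centers; an input near a center mostly "hears" that center's vote
centers = [np.array([0.0, 0.0]), np.array([3.0, 3.0])]
betas = [1.0, -1.0]
print(rbf_network(np.array([0.1, 0.0]), centers, betas))  # close to +1: near the first center
```

Note how the locality claim from the text shows up directly: the second unit's contribution at a point near the first center is essentially zero.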
14.2 RBF Network Learning
This section mainly discusses why the full RBF network is not used in practice and what characterizes it, which naturally leads into Section 14.3.
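In the full RBF network, every training example is its own center; with a Gaussian RBF and squared error the output weights come from solving a linear system, and with no regularization the network interpolates the training data exactly (E_in = 0), which is exactly the overfitting worry that motivates using fewer centers. A minimal sketch, assuming Gaussian RBFs and a hypothetical ridge parameter lam:

```python
import numpy as np

def full_rbf_fit(X, y, gamma=1.0, lam=0.0):
    """Full RBF network: every training point is a center.
    Solve (Z + lam*I) beta = y, where Z[n, m] = exp(-gamma * ||x_n - x_m||^2)."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    Z = np.exp(-gamma * sq)
    return np.linalg.solve(Z + lam * np.eye(len(X)), y)

def full_rbf_predict(x, X, beta, gamma=1.0):
    z = np.exp(-gamma * np.sum((X - x) ** 2, axis=1))
    return z @ beta

X = np.array([[0.0], [1.0], [2.0]])
y = np.array([0.0, 1.0, 0.0])
beta = full_rbf_fit(X, y)
# With lam = 0 the network interpolates: predictions at training points equal y
print(full_rbf_predict(np.array([1.0]), X, beta))  # ~= 1.0
```

Setting lam > 0 trades the exact fit for smoother weights, the usual ridge-style remedy mentioned in the lecture's discussion of the full network.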
14.3 K-means algorithm
14.4 K-means and RBF Network in Action
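Putting 14.3 and 14.4 together: run k-means to pick a small number of prototype centers, build the RBF feature matrix on those centers, and fit the output weights by linear regression. A self-contained sketch under those assumptions (gamma, k, and the toy data are illustrative):

```python
import numpy as np

def kmeans_centers(X, k, iters=50):
    """Minimal k-means (simplified deterministic init) to pick RBF centers."""
    centers = X[:k].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(2), axis=1)
        centers = np.array([X[labels == j].mean(0) for j in range(k)])
    return centers

def fit_rbf_net(X, y, k=2, gamma=1.0):
    """RBF network training: k-means for centers, then linear regression
    on the RBF-transformed features."""
    mu = kmeans_centers(X, k)
    Z = np.exp(-gamma * ((X[:, None] - mu[None]) ** 2).sum(2))
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return mu, beta

def predict(x, mu, beta, gamma=1.0):
    return np.exp(-gamma * ((mu - x) ** 2).sum(1)) @ beta

X = np.array([[0.0, 0.0], [9.0, 9.0], [0.5, 0.5], [9.5, 9.5]])
y = np.array([-1.0, 1.0, -1.0, 1.0])
mu, beta = fit_rbf_net(X, y, k=2)
print(predict(np.array([0.2, 0.2]), mu, beta))  # negative: same class as first cluster
```

Unlike the full RBF network, only k ≪ N centers survive, so the hypothesis is cheaper to evaluate and regularized by construction.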
Off Topic:
T1: CNNs and RBF networks both carry biological meaning; other network architectures should carry some meaning of their own as well!