1. Some basic symbols
2. Cost function
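The cost function item above can be sketched in code. This is a minimal sketch of the regularized cross-entropy cost for a one-hidden-layer sigmoid network; the function name `nn_cost` and the exact argument layout are my own assumptions, not from the notes.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nn_cost(Theta1, Theta2, X, Y, lam):
    """Regularized cross-entropy cost for a one-hidden-layer network.

    X: (m, n) inputs; Y: (m, K) one-hot labels;
    Theta1/Theta2: weight matrices whose first column is the bias weight.
    """
    m = X.shape[0]
    # Forward pass, prepending a bias unit at each layer
    a1 = np.hstack([np.ones((m, 1)), X])
    a2 = np.hstack([np.ones((m, 1)), sigmoid(a1 @ Theta1.T)])
    h = sigmoid(a2 @ Theta2.T)          # (m, K) predictions
    # Cross-entropy summed over all examples and output units
    J = -np.sum(Y * np.log(h) + (1 - Y) * np.log(1 - h)) / m
    # Regularization term: skip the bias column of each Theta
    J += lam / (2 * m) * (np.sum(Theta1[:, 1:] ** 2)
                          + np.sum(Theta2[:, 1:] ** 2))
    return J
```

Note that the regularization sum deliberately excludes the first column of each weight matrix, since bias weights are conventionally not penalized.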
==================== Backpropagation Algorithm ====================
1. The goal: compute the partial derivatives of the cost function
2. Forward propagation through the network's computation graph; computing the partial derivatives requires a backward pass
3. The backpropagation algorithm itself
4. Subtopics

==================== Backpropagation Intuition ====================
1. The backward computation mirrors the forward computation
2. Considering a single training example simplifies the cost function
3. The delta terms are partial derivatives of the cost with respect to Theta

==================== Implementation Note: Unrolling Parameters ====================
1. Unrolling the parameter matrices into a single vector
2. The learning algorithm: pass the unrolled vector to the optimizer and reshape it back inside the cost function

==================== Gradient Checking ====================
1. Numerical estimation of a gradient
2. Approximating all the gradients this way
3. The essential reason backpropagation is used instead of the numerical gradient: the numerical estimate needs two cost evaluations per parameter, which is far too slow for training
4. Implementation notes

==================== Random Initialization ====================
1. Zero initialization is not suitable for neural networks: every hidden unit would compute the same function
2. Small random initialization to break the symmetry

==================== Putting It Together ====================
1. More hidden units means more computational capacity; using the same number of units in every hidden layer is a reasonable default
2. The steps to train a neural network are similar to those for regression; the key difference is using backpropagation to compute the partial derivatives
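The backpropagation items above can be sketched for the one-hidden-layer case. This is a sketch under my own assumptions (sigmoid activations, cross-entropy cost, function name `backprop` invented for illustration), not the notes' definitive implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop(Theta1, Theta2, X, Y):
    """One forward/backward pass for a one-hidden-layer sigmoid network.

    Returns the (unregularized) gradients for Theta1 and Theta2.
    """
    m = X.shape[0]
    # Forward pass
    a1 = np.hstack([np.ones((m, 1)), X])             # (m, n+1)
    z2 = a1 @ Theta1.T
    a2 = np.hstack([np.ones((m, 1)), sigmoid(z2)])   # (m, h+1)
    a3 = sigmoid(a2 @ Theta2.T)                      # (m, K)
    # Backward pass: output-layer error, then hidden-layer error
    d3 = a3 - Y                                      # (m, K)
    d2 = (d3 @ Theta2[:, 1:]) * sigmoid(z2) * (1 - sigmoid(z2))  # (m, h)
    # Accumulate and average the gradients
    grad2 = d3.T @ a2 / m
    grad1 = d2.T @ a1 / m
    return grad1, grad2
```

The hidden-layer delta skips the bias column of `Theta2` (`Theta2[:, 1:]`), because no error propagates back through a constant bias unit.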
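The unrolling note above can be shown concretely. This is a small sketch; the helper names `unroll` and `reshape_params` are hypothetical, invented here for illustration.

```python
import numpy as np

def unroll(Theta1, Theta2):
    """Flatten both parameter matrices into one vector for the optimizer."""
    return np.concatenate([Theta1.ravel(), Theta2.ravel()])

def reshape_params(theta, shape1, shape2):
    """Recover Theta1 and Theta2 (with the given shapes) from the vector."""
    n1 = shape1[0] * shape1[1]
    Theta1 = theta[:n1].reshape(shape1)
    Theta2 = theta[n1:n1 + shape2[0] * shape2[1]].reshape(shape2)
    return Theta1, Theta2
```

Generic optimizers expect a single parameter vector, so the cost function reshapes the vector back into matrices internally, computes the cost and gradients, and unrolls the gradients again on the way out.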
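The numerical gradient estimate from the gradient-checking items can be sketched as a two-sided finite difference; the function name `numerical_gradient` is my own.

```python
import numpy as np

def numerical_gradient(J, theta, eps=1e-4):
    """Two-sided finite-difference estimate of dJ/dtheta.

    J: scalar cost function of an unrolled parameter vector theta.
    Needs two cost evaluations per parameter, hence only for checking.
    """
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        step = np.zeros_like(theta)
        step[i] = eps
        grad[i] = (J(theta + step) - J(theta - step)) / (2 * eps)
    return grad
```

This is used once to confirm that the backpropagation gradients are correct, then switched off: for n parameters it costs 2n full cost evaluations per gradient, which is why backpropagation is used for the actual training loop.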
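The random-initialization item can be sketched as follows; the function name `random_init` and the range value 0.12 are assumptions (0.12 is a commonly used small range, not something the notes specify).

```python
import numpy as np

def random_init(fan_out, fan_in, eps=0.12):
    """Weight matrix drawn uniformly from [-eps, eps] to break symmetry.

    fan_in excludes the bias unit; one extra column is added for it.
    Initializing to zeros would make every hidden unit identical
    through training, which is why random values are required.
    """
    return np.random.uniform(-eps, eps, size=(fan_out, fan_in + 1))
```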
Study notes on Neural Networks: Learning