Stanford University CS231n Course Notes 1: Neural Networks

From Linear Regression to Neural Networks


Mini-batch SGD

Forward propagation computes the loss; backpropagation computes the gradients, and the parameters are updated according to those gradients.
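This forward/backward/update loop can be sketched on a toy linear model; the data and hyperparameter names below (`W`, `lr`, `batch_size`) are illustrative assumptions, not from the notes:

```python
import numpy as np

# Mini-batch SGD on a toy linear regression problem (assumed data).
np.random.seed(0)
X = np.random.randn(200, 3)            # 200 samples, 3 features
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.01 * np.random.randn(200)

W = np.zeros(3)                        # parameters to learn
lr, batch_size = 0.1, 32

for step in range(300):
    idx = np.random.choice(len(X), batch_size, replace=False)  # sample a mini-batch
    Xb, yb = X[idx], y[idx]
    pred = Xb @ W                                  # forward pass: predictions
    loss = np.mean((pred - yb) ** 2)               # mini-batch loss (MSE)
    grad = 2 * Xb.T @ (pred - yb) / batch_size     # backward pass: dLoss/dW
    W -= lr * grad                                 # update along negative gradient
```

After a few hundred steps, `W` approaches the generating weights because each noisy mini-batch gradient points, on average, in the full-batch gradient direction.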

Forward and backward passes traverse the computational graph in topological order

class ComputationalGraph(object):
    def forward(self, inputs):
        # 1. [pass inputs to input gates ...]
        # 2. forward the computational graph:
        for gate in self.graph.nodes_topologically_sorted():
            gate.forward()
        return loss  # the final gate in the graph outputs the loss

    def backward(self):
        for gate in reversed(self.graph.nodes_topologically_sorted()):
            gate.backward()  # little piece of backprop (chain rule applied)
        return inputs_gradients
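The pseudocode assumes gate objects that expose `forward` and `backward` methods; a hypothetical multiply gate might look like the following (the class name and input-caching scheme are illustrative, not part of the notes):

```python
# A sketch of one gate in the graph: forward() computes the local output,
# backward() applies the chain rule to route the upstream gradient to inputs.
class MultiplyGate(object):
    def forward(self, x, y):
        self.x, self.y = x, y    # cache inputs for use in the backward pass
        return x * y

    def backward(self, dz):
        dx = self.y * dz         # dL/dx = y * dL/dz (chain rule)
        dy = self.x * dz         # dL/dy = x * dL/dz
        return dx, dy

gate = MultiplyGate()
z = gate.forward(3.0, -4.0)      # local output: 3 * -4
dx, dy = gate.backward(2.0)      # upstream gradient dL/dz = 2.0
```

Backpropagation through the whole graph is just this local chain-rule step repeated at every gate in reverse topological order.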

Batch Normalization

Advantages: improves gradient flow through the network, allows higher learning rates, reduces the dependence on weight initialization, acts as a form of regularization, and reduces the need for dropout.
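A minimal sketch of the batch-norm forward pass at training time, assuming a mini-batch `X` of shape (N, D) with learned scale and shift parameters `gamma` and `beta` (the function name and test values are illustrative):

```python
import numpy as np

def batchnorm_forward(X, gamma, beta, eps=1e-5):
    mu = X.mean(axis=0)                     # per-feature batch mean
    var = X.var(axis=0)                     # per-feature batch variance
    X_hat = (X - mu) / np.sqrt(var + eps)   # normalize: zero mean, unit variance
    return gamma * X_hat + beta             # learned scale and shift

# Illustrative check: badly scaled inputs come out normalized.
np.random.seed(1)
X = 5.0 * np.random.randn(64, 4) + 3.0
out = batchnorm_forward(X, gamma=np.ones(4), beta=np.zeros(4))
```

Because each feature is normalized across the batch before the next layer sees it, activations stay in a well-conditioned range regardless of how the inputs or earlier weights are scaled.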
Activation Functions

Data preprocessing

Learning Rate

If the loss does not decrease, the learning rate is too small.
If the loss explodes, the learning rate is too large; a NaN loss likewise usually means the learning rate is too large.
Learning rate decay

1. Step decay: reduce the learning rate after a certain number of epochs
2. Exponential decay
3. 1/t decay: decrease over time
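The three schedules can be sketched as follows; the function names and hyperparameter values (`drop`, `every`, `k`) are illustrative assumptions, not prescribed by the notes:

```python
import math

def step_decay(lr0, epoch, drop=0.5, every=10):
    # halve the rate every `every` epochs
    return lr0 * (drop ** (epoch // every))

def exponential_decay(lr0, t, k=0.05):
    # lr0 * e^{-kt}: smooth continuous decay
    return lr0 * math.exp(-k * t)

def one_over_t_decay(lr0, t, k=0.05):
    # lr0 / (1 + kt): decays over time, more slowly than exponential
    return lr0 / (1.0 + k * t)
```

All three start at `lr0` and shrink the step size as training progresses, so early updates move fast while late updates fine-tune.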
Optimization methods

Adam


RMSProp
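The two update rules can be sketched as follows; the hyperparameter values are the commonly used defaults, assumed here rather than stated in the notes:

```python
import numpy as np

def rmsprop_update(w, dw, cache, lr=1e-3, decay=0.9, eps=1e-8):
    cache = decay * cache + (1 - decay) * dw**2   # running average of squared grads
    w = w - lr * dw / (np.sqrt(cache) + eps)      # per-parameter adaptive step
    return w, cache

def adam_update(w, dw, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * dw        # first moment (momentum-like)
    v = b2 * v + (1 - b2) * dw**2     # second moment (RMSProp-like)
    m_hat = m / (1 - b1**t)           # bias correction for the warm-up steps
    v_hat = v / (1 - b2**t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```

Adam essentially combines momentum (the first moment) with RMSProp's per-parameter scaling (the second moment), plus bias correction so early steps are not undersized.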

Second-order optimization methods

Dropout



Why is dropout effective? Dropout is equivalent to training an ensemble of models: each pattern of open and closed units is a different model. At test time, a Monte Carlo estimate would average the results of many different sampled models; in practice, a single forward pass with all nodes open (and activations scaled appropriately) approximates that average.
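The single-forward-pass trick is usually implemented as inverted dropout, sketched below; the keep probability `p=0.5` and function name are illustrative assumptions:

```python
import numpy as np

def dropout_forward(x, p=0.5, train=True):
    if not train:
        return x                                   # test time: all nodes open, no scaling
    mask = (np.random.rand(*x.shape) < p) / p      # keep each unit with prob p, rescale by 1/p
    return x * mask

np.random.seed(0)
x = np.ones(10000)
out_train = dropout_forward(x, p=0.5, train=True)  # roughly half the units zeroed, rest doubled
out_test = dropout_forward(x, train=False)         # identity at test time
```

Because the surviving activations are rescaled by 1/p during training, their expected value matches the test-time activations, so no extra scaling is needed at inference.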


