Caffe: How are the forward and backward passes determined?

Source: Internet
Author: User
Someone had doubts about the statement that "Caffe does all the bookkeeping for any DAG of layers to ensure correctness of the forward and backward passes." Here is my explanation:

First, an overview of how Caffe determines the forward order: the layer type string from the network parameter file is used to look up the layer's creator function in the layer registry, an instance of the layer is created and pushed into layers_, and the layer's name is pushed into layer_names_ (a minimal sketch of this registry pattern is given below). Because layers are created in the order they appear in the parameter file, the forward pass runs through layers_ in exactly that order.
Whether backpropagation is needed is decided from the learning-rate multipliers: if a parameter's lr_mult is 0, that parameter needs no gradient, and a layer whose parameters all have lr_mult 0 (and whose inputs need no gradients) is skipped during the backward pass. The DAG itself is defined by the user in the network definition.
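
To make the registry step concrete, here is a minimal, self-contained C++ sketch of the pattern (an illustration of the idea only, not Caffe's actual LayerRegistry code): the layer type string from the parameter file is mapped to a creator function, and the created instance is what Net::Init pushes into layers_.

    #include <functional>
    #include <map>
    #include <memory>
    #include <string>

    // Toy layer hierarchy standing in for Caffe's Layer classes.
    struct Layer { virtual ~Layer() {} };
    struct ConvolutionLayer : Layer {};

    // Map from the layer type string (as written in the network definition)
    // to a creator function.
    using Creator = std::function<std::shared_ptr<Layer>()>;
    std::map<std::string, Creator>& Registry() {
      static std::map<std::string, Creator> registry;
      return registry;
    }

    int main() {
      // In Caffe, registration happens via macros when each layer is compiled in.
      Registry()["Convolution"] = [] { return std::make_shared<ConvolutionLayer>(); };
      // During Net::Init, the type string from the layer parameter is looked up,
      // the creator is called, and the result is pushed into layers_ in file order.
      std::shared_ptr<Layer> layer = Registry().at("Convolution")();
      (void)layer;
      return 0;
    }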

The following code (from Net<Dtype>::Init in net.cpp) shows how the order of the forward pass is determined:
    // Whether the shared layer needs to be obtained from the root solver
    if (share_from_root) {
      LOG(INFO) << "Sharing layer " << layer_param.name() << " from root net";
      // Get the layer from the root solver's net and push it into the layers_ container
      layers_.push_back(root_net_->layers_[layer_id]);
      layers_[layer_id]->SetShared(true);
    } else {
      // Otherwise create the layer from the layer registry according to the
      // layer parameters and push it into layers_
      layers_.push_back(LayerRegistry<Dtype>::CreateLayer(layer_param));
    }
    // Push the name of the layer into layer_names_
    layer_names_.push_back(layer_param.name());
    if (Caffe::root_solver()) {
      LOG(INFO) << "Creating Layer " << layer_param.name();
    }
    bool need_backward = false;
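
Because layers are stored in layers_ in the order they were created, running the forward pass is just a front-to-back walk over that container. A simplified sketch of what Net<Dtype>::ForwardFromTo in net.cpp does (paraphrased; checks and debug output omitted, details may differ between Caffe versions):

    // Simplified sketch of Net<Dtype>::ForwardFromTo: the forward order is
    // exactly the order of layers_.
    template <typename Dtype>
    Dtype Net<Dtype>::ForwardFromTo(int start, int end) {
      Dtype loss = 0;
      for (int i = start; i <= end; ++i) {
        // Each layer reads its bottom blobs and writes its top blobs;
        // loss layers contribute to the total loss.
        loss += layers_[i]->Forward(bottom_vecs_[i], top_vecs_[i]);
      }
      return loss;
    }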

The following code shows how the need for backpropagation is determined:
    // Iterate through each of the parameters of the layer
    for (int param_id = 0; param_id < num_param_blobs; ++param_id) {
      // ParamSpec specifies training parameters (multipliers on global learning
      // constants, and the name and other settings used for weight sharing).
      // If the current param_id is less than param_size, use the ParamSpec from
      // the layer parameter; otherwise use default_param_spec.
      const ParamSpec* param_spec = (param_id < param_size) ?
          &layer_param.param(param_id) : &default_param_spec;
      // The parameter needs backward only if its learning-rate multiplier is nonzero
      const bool param_need_backward = param_spec->lr_mult() != 0;
      // OR the result into the layer-level need_backward flag
      need_backward |= param_need_backward;
      // Record whether gradients should be propagated to this parameter blob,
      // according to the result above
      layers_[layer_id]->set_param_propagate_down(param_id,
                                                  param_need_backward);
    }

The full code is in the Init function in net.cpp.
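
As for how the flags computed above are used: during backpropagation the network walks layers_ in reverse and simply skips any layer whose layer_need_backward_ flag is false. A simplified sketch of Net<Dtype>::BackwardFromTo (paraphrased; checks and debug output omitted, details may differ between Caffe versions):

    // Simplified sketch of Net<Dtype>::BackwardFromTo: layers marked as not
    // needing backward are skipped entirely.
    template <typename Dtype>
    void Net<Dtype>::BackwardFromTo(int start, int end) {
      for (int i = start; i >= end; --i) {
        if (layer_need_backward_[i]) {
          layers_[i]->Backward(top_vecs_[i], bottom_need_backward_[i],
                               bottom_vecs_[i]);
        }
      }
    }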
