(vi) 6.15 Neural Networks: Deep Belief Networks


Hinton showed in a 2006 Science paper that RBMs can be stacked and trained greedily, layer by layer, to form a Deep Belief Network (DBN), a network that can learn high-level features of the training data. A DBN is a generative model in which the joint distribution over the visible variable x and the hidden layers h^1, ..., h^l is:

    P(x, h^1, ..., h^l) = P(x | h^1) P(h^1 | h^2) ... P(h^{l-2} | h^{l-1}) P(h^{l-1}, h^l)

Here x = h^0; for each k, P(h^{k-1} | h^k) is the conditional distribution of the visible units of the k-th RBM given its hidden units, and P(h^{l-1}, h^l) is the joint distribution of the DBN's top visible and hidden layers, i.e. of the top-level RBM, as given below.
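For binary units these take the standard RBM forms sketched below; the weight and bias notation W^k, b^k for the k-th RBM is an assumption introduced here for concreteness, not taken from the original.

    P(h^{k-1} \mid h^k) = \prod_i P(h^{k-1}_i \mid h^k),
    \quad P(h^{k-1}_i = 1 \mid h^k) = \sigma\Big( b^{k-1}_i + \sum_j W^k_{ij} h^k_j \Big),

    P(h^{l-1}, h^l) \propto \exp\Big( (b^{l-1})^\top h^{l-1} + (b^l)^\top h^l + (h^{l-1})^\top W^l h^l \Big),

where \sigma(z) = 1 / (1 + e^{-z}) is the logistic sigmoid.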

Training of DBN:

1. First, fully train the first RBM.
2. Fix the weights and biases of the first RBM, and use the states of its hidden units as the input vector for the second RBM.
3. After the second RBM has been fully trained, stack it on top of the first RBM.
4. Repeat the steps above for as many layers as desired (a minimal code sketch of this greedy procedure follows this section).
5. If the training data are labeled, then when training the top-level RBM its visible layer must contain, in addition to the ordinary visible units, units that represent the class labels, and all of them are trained together:
   a) Suppose the top-level RBM has 500 visible units and the training data fall into 10 classes;
   b) then the top-level RBM actually has 510 visible units. For each training example, the unit corresponding to its label is set to 1 and the other label units are set to 0.
6. A 4-layer DBN is trained in exactly this way (in the accompanying figure, the green part is the label layer that joins the topmost RBM).

DBN fine-tuning stage: the generative model is tuned with the contrastive wake-sleep algorithm, which proceeds as follows (a simplified sketch is given below):

1. Except for the top-level RBM, the weights of every other layer are split into upward recognition (cognitive) weights and downward generative weights.
2. Wake phase: the recognition process. External inputs and the upward (recognition) weights produce an abstract representation (node states) at each layer, and gradient descent is used to modify the downward (generative) weights between layers. Intuitively: "If reality differs from what I imagined, change my generative weights so that what I imagine becomes this reality."
3. Sleep phase: the generative process. The top-level representation (the concepts formed while awake) and the downward weights generate the states of the lower layers, while the upward (recognition) weights between layers are modified. Intuitively: "If the scene in my dream is not the concept it should correspond to in my mind, change my recognition weights so that this scene maps to that concept."

Use of a trained DBN (see the last sketch below):

1. Start the top-level RBM's hidden units in random states and run a sufficient number of Gibbs sampling steps.
2. Propagate the result downward to obtain the states of each lower layer.
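As a concrete illustration of the greedy layer-wise pre-training in steps 1-4, here is a minimal NumPy sketch assuming binary units and CD-1 (one step of contrastive divergence); the layer sizes, learning rate, and function names are illustrative choices, not taken from the original.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sample(p):
        # Draw binary states from element-wise Bernoulli probabilities.
        return (np.random.rand(*p.shape) < p).astype(np.float64)

    def train_rbm(data, n_hidden, epochs=10, lr=0.05, batch=100):
        # Train one binary RBM with CD-1; returns (W, b_vis, b_hid).
        n_visible = data.shape[1]
        W = 0.01 * np.random.randn(n_visible, n_hidden)
        b_vis = np.zeros(n_visible)
        b_hid = np.zeros(n_hidden)
        for _ in range(epochs):
            np.random.shuffle(data)
            for i in range(0, len(data), batch):
                v0 = data[i:i + batch]
                # Positive phase: hidden probabilities given the data.
                h0_prob = sigmoid(v0 @ W + b_hid)
                h0 = sample(h0_prob)
                # Negative phase: one Gibbs step (reconstruct, then re-infer).
                v1_prob = sigmoid(h0 @ W.T + b_vis)
                h1_prob = sigmoid(v1_prob @ W + b_hid)
                # Contrastive-divergence parameter updates.
                W += lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / len(v0)
                b_vis += lr * (v0 - v1_prob).mean(axis=0)
                b_hid += lr * (h0_prob - h1_prob).mean(axis=0)
        return W, b_vis, b_hid

    def train_dbn(data, layer_sizes):
        # Greedy layer-wise stacking: each trained RBM is frozen and its
        # hidden-layer probabilities become the input "data" for the next RBM.
        rbms, x = [], data
        for n_hidden in layer_sizes:
            W, b_vis, b_hid = train_rbm(x.copy(), n_hidden)
            rbms.append((W, b_vis, b_hid))
            x = sigmoid(x @ W + b_hid)
        return rbms

    # Illustrative run: random binary "data" with a 784-500-500-2000 stack.
    dbn = train_dbn(np.random.rand(1000, 784).round(), [500, 500, 2000])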

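The next sketch is a heavily simplified version of the contrastive wake-sleep (up-down) fine-tuning described above, reusing sigmoid, sample, and the (W, b_vis, b_hid) tuples produced by train_dbn. It processes one example at a time, omits the label units, and uses illustrative hyperparameters; it only shows where the recognition weights R, the generative weights G, and the top-level CD step fit, and is not a faithful reimplementation of the original algorithm.

    def fine_tune_wake_sleep(rbms, data, lr=0.01, epochs=5):
        # Untie each lower layer's weights into upward recognition weights R
        # and downward generative weights G (both initialized from pre-training).
        lower = [{"R": W.copy(), "G": W.copy(),
                  "b_rec": b_hid.copy(), "b_gen": b_vis.copy()}
                 for W, b_vis, b_hid in rbms[:-1]]
        W_top, b_vis_top, b_hid_top = (a.copy() for a in rbms[-1])
        for _ in range(epochs):
            for x in data:
                x = x[None, :]  # one example at a time, shape (1, n_visible)
                # Wake phase: recognition pass upward, then adjust the downward
                # generative weights to reconstruct each layer from the one above.
                states = [x]
                for layer in lower:
                    states.append(sample(sigmoid(states[-1] @ layer["R"] + layer["b_rec"])))
                for layer, below, above in zip(lower, states[:-1], states[1:]):
                    recon = sigmoid(above @ layer["G"].T + layer["b_gen"])
                    layer["G"] += lr * (below - recon).T @ above
                    layer["b_gen"] += lr * (below - recon).ravel()
                # Top-level RBM: one step of contrastive divergence on the
                # representation produced by the upward pass.
                v0 = states[-1]
                h0 = sample(sigmoid(v0 @ W_top + b_hid_top))
                v1 = sigmoid(h0 @ W_top.T + b_vis_top)
                h1 = sigmoid(v1 @ W_top + b_hid_top)
                W_top += lr * (v0.T @ h0 - v1.T @ h1)
                b_vis_top += lr * (v0 - v1).ravel()
                b_hid_top += lr * (h0 - h1).ravel()
                # Sleep phase: generate downward with the generative weights,
                # then adjust the upward recognition weights to predict each
                # generated layer from the one below it.
                s = sample(v1)
                downs = [s]
                for layer in reversed(lower):
                    s = sample(sigmoid(s @ layer["G"].T + layer["b_gen"]))
                    downs.append(s)
                downs.reverse()  # order bottom -> top, like `states`
                for layer, below, above in zip(lower, downs[:-1], downs[1:]):
                    pred = sigmoid(below @ layer["R"] + layer["b_rec"])
                    layer["R"] += lr * below.T @ (above - pred)
                    layer["b_rec"] += lr * (above - pred).ravel()
        return lower, (W_top, b_vis_top, b_hid_top)

    # Illustrative call, reusing the stack trained by train_dbn above.
    lower_layers, top_rbm = fine_tune_wake_sleep(dbn, np.random.rand(200, 784).round())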
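Finally, a sketch of the "use of a trained DBN" steps above: random states for the top-level RBM's hidden units, a sufficient number of Gibbs sampling steps in the top RBM, then downward propagation through the lower layers. It reuses sigmoid, sample, and the pre-trained dbn stack from the first sketch; n_gibbs is an illustrative parameter.

    def generate_from_dbn(rbms, n_gibbs=200):
        # Sample from the DBN: Gibbs sampling in the top-level RBM, then
        # propagate the result downward through the lower layers.
        W_top, b_vis_top, b_hid_top = rbms[-1]
        # 1. Random initial states for the top RBM's hidden units, followed
        #    by a sufficient number of Gibbs sampling steps.
        h = sample(0.5 * np.ones((1, W_top.shape[1])))
        for _ in range(n_gibbs):
            v = sample(sigmoid(h @ W_top.T + b_vis_top))
            h = sample(sigmoid(v @ W_top + b_hid_top))
        # 2. Propagate downward: each lower RBM's weights act as the
        #    top-down (generative) connections.
        x = v
        for W, b_vis, _ in reversed(rbms[:-1]):
            x = sigmoid(x @ W.T + b_vis)
        return x  # probabilities over the bottom (data) layer

    sample_x = generate_from_dbn(dbn)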
