Hopfield Nets Study Notes

Source: Internet
Author: User
Tags: nets

I recently wanted to learn about RBMs (restricted Boltzmann machines), so I found Hinton's Neural Networks course on Coursera. Let's first review Hopfield networks.

Hopfield Nets with Hidden Units

https://class.coursera.org/neuralnets-2012-001/lecture/125

In simple terms, the states of the hidden units are used to represent an interpretation of the input units.

For example, an edge in a two-dimensional image may correspond to any number of edges in the three-dimensional world.

If each possible line segment in the image is represented by a "2d-line" unit, these units form a large collection, and any particular picture activates only a small subset of them.

If each possible 3D segment in the scene is represented by a "3d-line" unit, then each "2d-line" unit may be the projection of many "3d-line" units, and an activated "2d-line" unit can activate its corresponding "3d-line" units. Since we only perceive one 3D interpretation of a line at a time, we make these "3d-line" units compete. (The red lines in the figure indicate competitive, i.e. inhibitory, connections.)

When the lines represented by two "3d-line" units join in the three-dimensional world, we let the two units support each other, that is, we add an excitatory connection between them.

If they join at a right angle, we add a stronger connection.

Now we have a network that records how segments in the real world connect to each other and how they project onto the image.

Give this network an image, and it will form an interpretation of that image.

However, a single image can have multiple interpretations. For example, the Necker cube is an ambiguous figure with two interpretations: one way is to see a transparent cube viewed from above, from a higher vantage point; the other is to see a transparent cube viewed from below, from a lower vantage point.

We use a low-energy state to represent a good interpretation.
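As a minimal sketch of "low energy = good interpretation", here is the standard Hopfield energy function (the symbols W, b, s are the usual textbook notation, not from these notes): with binary states s, symmetric weights W, and biases b, E(s) = -1/2 sᵀWs - bᵀs, so units connected by a positive weight lower the energy by agreeing.

```python
import numpy as np

def hopfield_energy(s, W, b):
    """Hopfield energy E(s) = -1/2 * s^T W s - b^T s for states s in {-1, +1},
    symmetric weights W with zero diagonal, and biases b.
    Lower energy corresponds to a better interpretation."""
    s = np.asarray(s, dtype=float)
    return -0.5 * s @ W @ s - b @ s

# Toy example: two mutually supporting units (positive weight between them).
W = np.array([[0.0, 1.0],
              [1.0, 0.0]])
b = np.zeros(2)

print(hopfield_energy([1, 1], W, b))   # both on together: energy -1.0
print(hopfield_energy([1, -1], W, b))  # disagreeing: energy +1.0
```

The supporting connections between "3d-line" units described above are exactly positive entries of W; inhibitory (competitive) connections would be negative entries.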

Question one: how do we keep the energy from getting stuck in a local minimum?

Question two: how do we learn the weights between nodes (input-to-hidden and hidden-to-hidden)?

Using stochastic units to improve search

https://class.coursera.org/neuralnets-2012-001/lecture/127

This lesson focuses on how to avoid getting stuck in local minima by adding noise.

The update rule of a Hopfield net always decreases (or preserves) the energy, so it easily gets stuck in local minima.
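To see why, here is a sketch of the deterministic update rule (names are mine, not from the lecture): each unit in turn adopts whichever state lowers the energy given all the other units. Since the energy never increases and there are finitely many states, the net must settle into some minimum, but only the nearest one.

```python
import numpy as np

def deterministic_step(s, W, b):
    """One sequential sweep of deterministic Hopfield updates: each unit
    picks the state that does not raise the energy given the others.
    Energy is monotonically non-increasing, so the net settles into the
    nearest minimum - which may be only a local one."""
    s = s.copy()
    for i in range(len(s)):
        gap = W[i] @ s + b[i]          # energy gap between s_i = +1 and s_i = -1
        s[i] = 1.0 if gap >= 0 else -1.0
    return s

# Demo: random symmetric weights, random start, iterate to a fixed point.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4)); W = (W + W.T) / 2; np.fill_diagonal(W, 0)
b = np.zeros(4)
s = np.sign(rng.normal(size=4))
for _ in range(10):
    s = deterministic_step(s, W, b)
print(s)  # a fixed point: one more sweep leaves it unchanged
```

Starting from a different random state can reach a different fixed point with higher energy, which is exactly the local-minimum problem the noise is meant to solve.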

We can use random noise to avoid falling into local minima.

Start by adding a lot of noise, so that the system can easily cross energy barriers.

Then slowly reduce the noise, so that the system eventually settles into a deep minimum. This process is "simulated annealing", and the algorithm was proposed around the same time as Hopfield nets.

So, how does the temperature affect the transition probability?

When the temperature is high, the system can easily climb out of a local minimum over the energy barrier, but the probability of jumping out of a deep minimum is also larger.
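The effect of temperature can be sketched with the standard stochastic binary unit (the logistic rule below is the usual formulation from the course, but this particular demo is my own illustration): a unit turns on with probability p = 1 / (1 + exp(-ΔE/T)), where ΔE is the energy gap between its off and on states. High T pushes p toward 1/2 (noisy, barrier-crossing behavior); low T makes the unit nearly deterministic.

```python
import math

def p_on(energy_gap, T):
    """Probability that a stochastic binary unit turns on, given the
    energy gap dE = E(off) - E(on) and temperature T:
        p(on) = 1 / (1 + exp(-dE / T))
    High T flattens p toward 1/2; low T recovers the deterministic rule."""
    return 1.0 / (1.0 + math.exp(-energy_gap / T))

# Same energy gap, decreasing temperature (an annealing schedule in miniature):
for T in (10.0, 1.0, 0.1):
    print(T, p_on(2.0, T))
# As T drops, p_on(2.0, T) climbs from about 0.55 toward 1.0:
# the unit goes from nearly random to nearly deterministic.
```

At T close to 0 this reduces to the deterministic update above, which is why annealing ends with the system frozen into a (hopefully deep) minimum.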
