Tutorial on building a Hopfield network using Python



Something hot will inevitably cool. A tidy room will become messy. In much the same way, a transmitted message becomes distorted. The short-term remedies for these conditions are, respectively, reheating, cleaning up, and running a Hopfield network. This article introduces you to the last of the three: an algorithm that can eliminate noise, given one specific parameter. net.py is a particularly simple Python implementation that shows how its basic parts fit together and why a Hopfield network can sometimes recover an original pattern from a distorted one. Despite the limitations of this implementation, you can still gain a lot of useful and enlightening insight into Hopfield networks.
What are you looking for?



I assume you are reading this article because you face some computational problem. You have been advised that a neural network algorithm might provide a solution; specifically, the suggestion is to use a Hopfield network. I further assume that you need a general picture so you can decide whether the suggestion is practical and warrants deeper study. The following highly abbreviated application of the Hopfield network may point you toward solving your problem.



First, your problem involves a basic set of patterns encoded in -1 and +1. If necessary, they could instead be encoded in 0 and +1. These patterns might be standardized binary renderings of postage stamps (see Resources). The next element is a set of patterns that deviate from this basic set. What you are looking for is code that takes a distorted pattern as input and outputs the basic pattern it derives from. In other words, you want an algorithm that can take the code describing a particular stamp and output the underlying basic stamp pattern. Your search is not guaranteed to succeed, but there is an acceptable failure rate: for you, a certain rate of misrecognized stamps will not significantly affect your project.



If this reminds you of your own problem, the following may be the beginning of your solution design. By the end, you should be able to answer the basic questions: What is a Hopfield network? How does it work? What are its limitations? What can it do for me? Do I want to spend more time studying it?



Patterns and their distortions


Let's first look at five arbitrary patterns that will later be distorted and then recovered. Each can be visualized as a 10-by-10 matrix of black and white squares.





In net.py, click any of P2 through P5 to display the other patterns. For encoding, each of the five patterns is initially described as a Python list. The first pattern, for example, is described in Listing 1.
Listing 1. Pattern P1


    P1 = [[ 1,  1,  1,  1,  1,  1,  1,  1,  1,  1],
          [-1, -1, -1, -1, -1, -1, -1, -1,  1,  1],
          [-1, -1, -1, -1, -1, -1, -1, -1,  1,  1],
          [-1, -1, -1, -1, -1, -1, -1, -1,  1,  1],
          [-1, -1, -1, -1, -1, -1, -1, -1,  1,  1],
          [-1, -1, -1, -1, -1, -1, -1, -1,  1,  1],
          [-1, -1, -1, -1, -1, -1, -1, -1,  1,  1],
          [-1, -1, -1, -1, -1, -1, -1, -1,  1,  1],
          [-1, -1, -1, -1, -1, -1, -1, -1,  1,  1],
          [-1, -1, -1, -1, -1, -1, -1, -1,  1,  1]]


The black and white squares correspond to -1 and +1, respectively. The list is then converted to an array. (See Resources for a reference to the Python array library I use.) Each element of such a pattern, -1 or +1, corresponds to a node object in an array of nodes.



One node object has three primary properties:


    1. A node object has a value, which is an element of the pattern.
    2. A node also has an address, which is its address in an array.
    3. Each node also has a color so that it can be displayed.


As mentioned earlier, one function of a Hopfield network is to eliminate noise. To exercise that function, we need a way to introduce noise into a pattern. Clicking Add Noise does exactly that.







To introduce noise into a pattern, net.py visits every address in the array of nodes. It then draws a random number in [0,1), that is, between 0 and 1 including 0 but excluding 1. If the number is less than a fixed threshold, the network flips the node's value and color; otherwise it leaves the node unchanged. By default, this threshold is set to 0.20, so any given node has a 20% chance of having its value and color flipped. You can change this probability with the slider. At 0% there is no noise, and at 100% the array of nodes is simply inverted. Values in between introduce correspondingly greater or lesser degrees of noise into a pattern. Since the Hopfield network is a noise-eliminating algorithm, it can then take such a distorted pattern as input.
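The noise routine described above can be sketched in a few lines of plain Python. This is an illustrative sketch with invented names, not net.py's actual code:

```python
import random

def add_noise(pattern, probability=0.20):
    """Return a copy of a +1/-1 pattern with each element
    flipped independently with the given probability."""
    noisy = []
    for row in pattern:
        new_row = []
        for value in row:
            # random.random() is uniform in [0, 1); flip when below threshold
            if random.random() < probability:
                new_row.append(-value)
            else:
                new_row.append(value)
        noisy.append(new_row)
    return noisy
```

At `probability=0.0` the pattern is returned unchanged, and at `1.0` every node is inverted, mirroring the slider's two extremes.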



Although sometimes obscured by inopportune explanations, the relevant algorithm is fairly straightforward to implement. Next, I'll walk you through an implementation of the algorithm, and then explain briefly why it can eliminate noise.



Weight


As David Mertz and I described in a previous developerWorks article, An Introduction to Neural Nets, the human brain consists of about 100 billion neurons, each connected to thousands of others. Neurons both receive and transmit varying energies. An important characteristic of neurons is that they do not react immediately upon receiving energy; instead, they accumulate it and fire, sending energy on to other neurons, only when the accumulated energy reaches a certain critical threshold.



When the brain learns, it can be thought of as adjusting the number and strength of these connections. There is no doubt that this is an extreme simplification of the biological facts, but for our purposes the simplification is useful as a model for implementing a neural network. The transition from biology to algorithm is made by translating a connection into a weight. (The perceptron uses weights in a different, and perhaps more intuitive, way. Before reading on, you may want to revisit An Introduction to Neural Nets.)



A weight object primarily encapsulates a value that represents the weight between one node and another. A weight object also has an address, which is its position in the weight array, and a color, which is used for display. One such display is a color-coded view of the weight array: net.py (see Resources for a link) keeps track of the lowest and highest weights, and it displays a key to the color values in the weight display.







Each row of the weight array is the list of weights between a given node and all the other nodes. There are two forms of Hopfield network: in one, each node has a weight to itself; in the other, it does not. Experience with net.py suggests that when nodes are not self-weighted, the node arrays do not always reconstruct to themselves. Select the No Self Weight option, and then try reconstructing P3 or P5. There are 100 nodes, so there are 10,000 weights, many of them redundant. By default, when nodes are self-weighted, there are 5,050 non-redundant weights; otherwise there are only 4,950.







Listing 2. Weight generation algorithm



PAT = { p : p is an R x C pattern }
WA  = { w : w is an (R*C) x (R*C) weight array }
for all (i,j) and (a,b) in the ranges of R and C:
    SUM = 0
    for p in PAT:
        SUM += p(i,j) * p(a,b)
    WA((C*i)+j, (C*a)+b) = SUM



The biologically inspired concept underlying the Hopfield network derives from Donald Hebb's work of 1949. He hypothesized that if a pair of nodes sends energy to each other at the same time, the weight between them should be greater than if only one were firing. He wrote: "When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased" (see Resources for more detail). In Hopfield-network terms, when a pair of nodes have the same value, in other words -1 or +1, the weight between them is greater. The products of the values of all possible node pairs determine the contents of the weight array: when two values are the same, their product is positive and the sum grows; when they differ, the sum shrinks.



In more detail, where do the weights come from? First, the Hopfield network needs access to a library or set of basic patterns, here P1 through P5. Weight generation begins when the network selects a pair of coordinates within the bounds of the basic pattern matrices. It then visits the corresponding nodes in each pattern, at each step adding the product of the two node values to a running sum. Once the network has visited every pattern, it sets the value of a weight object to that sum. Given a pair of nodes at (i,j) and (a,b), it sets the value of the weight object at (i*10+j, a*10+b) in the weight array.
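The procedure just described, following the pseudocode of Listing 2, can be sketched directly in plain Python. This is an illustration under my own naming, not net.py's actual code:

```python
def make_weights(patterns, rows=10, cols=10, self_weight=True):
    """Build the (rows*cols) x (rows*cols) Hopfield weight array.

    Each weight is the sum, over all stored patterns, of the product
    of the values at the two node positions (Hebbian learning).
    """
    n = rows * cols
    weights = [[0] * n for _ in range(n)]
    for i in range(rows):
        for j in range(cols):
            for a in range(rows):
                for b in range(cols):
                    s = sum(p[i][j] * p[a][b] for p in patterns)
                    weights[cols * i + j][cols * a + b] = s
    if not self_weight:
        # the "No Self Weight" variant zeroes the diagonal
        for k in range(n):
            weights[k][k] = 0
    return weights
```

Because `p[i][j] * p[a][b]` equals `p[a][b] * p[i][j]`, the array is symmetric, which is why roughly half of the 10,000 weights are redundant.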



That is how the weights are constructed, but how do they function in the larger Hopfield algorithm? How do they serve pattern reconstruction?



Reconstruction


Given the weight array and a distorted or noisy pattern, the Hopfield network can sometimes output the original pattern. There is no guarantee, but the percentage of cases the network gets right is impressive. It can do this synchronously or asynchronously.



Working asynchronously, the network traverses the distorted pattern and, at each node N, asks whether N's value should be set to -1 or +1.



To make this determination, the network traverses the row of the weight array that contains all the weights between N and the other nodes. Keep in mind that nodes may or may not be self-weighted.



At each step of this second traversal, it computes the product of (1) the weight between N and the other node and (2) the value of the other node. As you might expect, the network keeps a running sum of these products.



Now the network can make its decision. At least in this implementation, if the sum is less than 0, the network sets the node's value to -1, and if it is greater than or equal to 0, it sets the node's value to +1.





Listing 3. Reconstruction



for every node, N, in pattern P:
    SUM = 0
    for every node, A, in P:
        W = weight between N and A
        V = value of A
        SUM += W * V
    if SUM < 0:
        set N's value to -1
    else:
        set N's value to +1



The default update is asynchronous because the network sets a node's value as soon as it determines what that value should be. The update would be synchronous if the network first made all of its decisions and only then set the node values; in that case it stores each decision and updates the node array after the last decision is made. In net.py (see Resources), reconstruction is asynchronous by default, but note the option for synchronous reconstruction.
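Listing 3's update rule, in both its asynchronous and synchronous forms, can be sketched as follows. The function assumes a flat (rows*cols) x (rows*cols) weight array of the kind Listing 2 builds; the names are mine, not net.py's:

```python
def reconstruct(pattern, weights, rows=10, cols=10, synchronous=False):
    """One pass of Hopfield reconstruction over a +1/-1 pattern.

    Asynchronous mode updates each node in place as soon as its new
    value is decided; synchronous mode stores every decision first and
    applies them all at the end.
    """
    state = [v for row in pattern for v in row]  # flatten to length rows*cols
    n = rows * cols
    decisions = []
    for node in range(n):
        total = sum(weights[node][other] * state[other] for other in range(n))
        value = -1 if total < 0 else +1
        if synchronous:
            decisions.append(value)   # defer the update
        else:
            state[node] = value       # update immediately
    if synchronous:
        state = decisions
    # un-flatten back to a rows x cols pattern
    return [state[r * cols:(r + 1) * cols] for r in range(rows)]
```

In practice you would call this repeatedly until the returned pattern stops changing, i.e. until a stable state is reached.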



As you experiment with net.py, the behavior of the Hopfield network when reconstruction succeeds is striking. One such behavior is that even when the weight array is severely degraded, the network can still reconstruct patterns. My simple implementation's Degrade Weights traverses the weight array and randomly sets weights to 0. Viewing the weight array afterwards reveals the extent of the damage. Correct reconstruction despite that damage shows that, much like the brain, the Hopfield network has considerable fault tolerance. How does it all work? The mathematical account is not short; instead, here is a brief sketch of how it fits together.
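The degradation experiment is easy to mimic: walk the weight array and zero entries at random. A sketch with invented names, not net.py's actual Degrade Weights code:

```python
import random

def degrade_weights(weights, probability=0.5):
    """Randomly zero out weights in place, simulating damage."""
    n = len(weights)
    for i in range(n):
        for j in range(n):
            if random.random() < probability:
                weights[i][j] = 0
    return weights
```

A fussier version would zero `weights[i][j]` and `weights[j][i]` together to keep the array symmetric; this simplest form does not bother, and reconstruction often still succeeds.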



What happened


The details of the Hopfield network's algorithm explain why it can sometimes eliminate noise. As usual with algorithm analysis, the most troublesome part lies in the mathematical details, which in the present case are hard to describe and to visualize. Fortunately, some closely related phenomena make the workings of a Hopfield network intuitively visible.



When a ball drops into a simply curved bowl, it rolls to the bowl's lowest point. The curvature of the bowl is like a rule, or function, that takes the ball's entry point and returns the bottom of the bowl. A more complex curvature resembles a function that takes an entry point and returns one of several local low points.



Energy is integral to these simple phenomena. In both the simple and the complex case, the rolling ball has a measurable amount of energy. Over time, this energy decreases, and the ball eventually reaches a resting state in which its energy can get no smaller. In the complex case, there may be a still lower energy level elsewhere, but the ball cannot reach it.



Similarly, a pattern, distorted or not, can be regarded as having a specific, measurable amount of energy. So the patterns P1 through P5 each have an energy level.


Pattern Energy Level

Calculating a pattern's energy level is not complicated. The Hopfield network computes the product of the values of every possible node pair and the weight between them. The pattern's energy level is the sum of these products divided by negative 2. net.py displays the energy level of whatever pattern or node array is shown. As you reconstruct a pattern, you should be able to watch its energy level fall.
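That calculation, summing the value-times-weight products over all node pairs and dividing by negative 2, fits in a few lines. This hypothetical helper assumes the same flat weight layout as Listing 2:

```python
def energy(pattern, weights, rows=10, cols=10):
    """Hopfield energy: -1/2 times the sum over all node pairs (i, j)
    of weights[i][j] * state[i] * state[j]."""
    state = [v for row in pattern for v in row]  # flatten the pattern
    n = rows * cols
    total = sum(weights[i][j] * state[i] * state[j]
                for i in range(n) for j in range(n))
    return -total / 2.0
```

Stored patterns sit at local energy minima, so as repeated reconstruction passes run, this number should fall and then hold steady.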



During reconstruction, the network's decision to flip a node is based on the sum of the products of the other nodes' values and the weights between them. When the sum is less than 0, the node is set to -1; otherwise it is set to +1. When the product of a value and a weight is positive, it pushes the sum above 0, and so pushes the network toward setting the node's value to +1. When the product is negative, the sum is pushed toward or below 0, and the network is pushed toward setting the node's value to -1. Changing the weights changes the quantities being summed and so changes the direction in which the network is pushed as it decides. A pattern can be so distorted that the network is not pushed toward the right decisions, but when nothing is seriously amiss, the network is pushed in the right direction most of the time.



If you reconstruct any of the five patterns, you will find that each reconstructs to itself. This is as it should be, since each pattern already occupies a local minimum energy point; no reconstruction process can lower its energy level further. When you successfully reconstruct a distorted pattern, the network has lowered the distorted pattern's energy level to the level of one of the stored patterns. When it fails, it has lowered the distorted pattern's energy level to a spurious local minimum. In either case, the energy level can fall no further; it has reached a stable state. Describing the Hopfield network in terms of energy is interesting and important: on this basis it can be established mathematically that repeated application of the reconstruction algorithm eventually yields a stable pattern. (See Resources for more information.)



Conclusion


You should be aware of the limitations of Hopfield networks. One obvious and often noted limitation is that the patterns must be encoded as arrays composed either of -1 and +1 or of 0 and +1. As you have seen, the network can stabilize at a spurious local minimum. A more significant limitation is that when the number of stored patterns exceeds about 14% of the number of nodes in the node array, the probability that the network stabilizes at a spurious local minimum rises. In other words, each additional basic pattern calls for roughly seven more nodes. These limitations aside, the kind of pattern reconstruction discussed here may bear on your particular computing problem, and you now have the promised rough idea of the Hopfield algorithm. If it suits your needs, you also understand enough of the superstructure to build your own implementation: an algorithm for computing the weight array, a way of reconstructing distorted patterns, and an algorithm for computing a pattern's energy level.

