Bidirectional Associative Memory neural network

Source: Internet
Author: User

The study of associative memory networks is an important branch of neural networks, and the bidirectional associative memory (BAM) network proposed by B. Kosko in 1988 is the most widely used. The Hopfield network described earlier implements auto-association; for details, see the blog post "Feedback Neural Network: Hopfield Network". The BAM network realizes bidirectional hetero-association, and it comes in many forms, such as discrete, continuous, and adaptive. This blog post focuses on discrete BAM networks.

I. BAM network structure and principle

A BAM network is a two-layer bidirectional network: when an input signal is applied to one layer, the other layer produces an output. Since the initial pattern can be applied to either layer, information propagates bidirectionally, so there is no explicit input layer or output layer. One layer is called the X layer, with n neuron nodes; the other is called the Y layer, with m neuron nodes. The state vectors of the two layers can take unipolar binary values 0 and 1, or bipolar discrete values 1 and -1. The weight matrix from the X layer to the Y layer is W, and the weight matrix from Y back to X is its transpose W^T.

The process by which BAM realizes bidirectional hetero-association is the process of the network running from a dynamic state to a steady state. For a BAM network with an established weight matrix, when a sample X_p is applied to the X side, that side outputs X(1) = X_p. This is weighted by the W matrix to the Y side, and after the nonlinear transformation of the Y-side node transfer function F_y, the output Y(1) = F_y(W X(1)) is obtained. This output is then weighted by the W^T matrix and passed back from the Y side to the X side as input, and after the nonlinear transformation of the X-side transfer function F_x, the output X(2) = F_x[W^T Y(1)] = F_x{W^T F_y(W X(1))} is obtained. This bidirectional round-trip process continues until the states of all neurons on both sides no longer change. The network state at this point is called the steady state, and the corresponding Y-side output vector Y_p is the result of bidirectional association for pattern X_p. Similarly, if pattern Y_p is fed into the Y side, through the same bidirectional associative process the X side will output the associated result X_p. This bidirectional associative process can be expressed as:


the corresponding calculation formula is as follows:
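The original formula image is missing here; reconstructed from the description above, one round trip of the bidirectional process is:

```latex
Y(t+1) = F_y\bigl(W\,X(t)\bigr), \qquad
X(t+2) = F_x\bigl(W^{\mathrm{T}}\,Y(t+1)\bigr)
```

The iteration stops when neither X nor Y changes between successive round trips.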
For a fully trained weight matrix, when an incomplete or corrupted stored pattern is entered on one side of the BAM network, the network's finite run not only realizes the correct hetero-association on the other side, but also reconstructs the complete pattern on the input side. In other words, starting from an incomplete input pattern, BAM can both perform hetero-association and rebuild the complete input pattern, which is a very useful capability. For example, the implementation shared below stores several (X, Y) pattern pairs in a BAM network and computes the required weight matrix; the chosen example is a bidirectional association between names and telephone numbers: by entering a name with missing parts, the trained BAM network can associate the corresponding telephone number and at the same time complete the incomplete name.
II. Energy function and stability analysis of the BAM network

Similar to the Hopfield network, if the threshold T of the BAM network is 0, the energy function is defined as:
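The original formula image is missing; with zero thresholds, the BAM energy function given by Kosko has the form:

```latex
E(X, Y) \;=\; -\tfrac{1}{2}\,X^{\mathrm{T}} W^{\mathrm{T}} Y \;-\; \tfrac{1}{2}\,Y^{\mathrm{T}} W X \;=\; -\,Y^{\mathrm{T}} W X
```

Note that E is symmetric in the two layers: evaluating it from either side gives the same scalar, which is what makes a single energy function govern the bidirectional dynamics.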
The dynamic process of bidirectional association in a BAM network is a gradual decrease of the energy function along discrete trajectories in its state space. When bidirectional steady state is reached, the network must have fallen into a local or global energy minimum. The detailed derivation is not repeated in this blog post; after some deduction, the following formula for the energy change can be obtained:
The above indicates that the energy of the BAM network declines continuously during dynamic operation; when the network reaches an energy minimum, it enters a stable state and the states on both sides of the network no longer change. The proof places no restriction on the learning rule for the BAM weight matrix, and the stability result holds whether the states are updated synchronously or asynchronously. Since parallel synchronous updating is more efficient and converges faster than serial asynchronous updating, the synchronous update mode is commonly adopted.

III. Weight design of the BAM network

For discrete BAM networks, the transfer function is generally chosen as f(·) = sign(·). When the network only needs to store a single pattern pair (x1, y1), for it to be a stable state of the network the following conditions should be met:
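The formula image is missing here; reconstructed from the surrounding text, the stability conditions are:

```latex
F(W x_1) = y_1, \qquad F\bigl(W^{\mathrm{T}} y_1\bigr) = x_1
```

For bipolar vectors these are satisfied by the outer product W = y_1 x_1^T, since then W x_1 = \|x_1\|^2 y_1 and W^T y_1 = \|y_1\|^2 x_1, and the sign function strips the positive scale factors.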
When P pattern pairs need to be stored, the above conclusion is extended to a sum of outer products over the P patterns, giving the weight learning formula proposed by Kosko:
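The formula image is missing; Kosko's outer-product learning rule is:

```latex
W \;=\; \sum_{p=1}^{P} y_p\, x_p^{\mathrm{T}}
```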

Designing the weight matrix with the outer-product sum method cannot guarantee correct association for arbitrary P pattern pairs, but the following theorem shows that if the stored patterns are restricted, a BAM network designed with the outer-product sum method has good associative ability.
Theorem: if the P stored patterns x_p, p = 1, 2, ..., P, x ∈ {-1, 1}^n are pairwise orthogonal, and the weight matrix W is designed as above, then for any of the P stored patterns x_p input to the BAM network, a single pass correctly associates the corresponding pattern y_p.
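As a quick check of the theorem, here is a minimal sketch in Python; the patterns are hypothetical pairwise-orthogonal bipolar vectors chosen for illustration, not data from the post:

```python
import numpy as np

# Hypothetical pairwise-orthogonal bipolar patterns (rows of a Hadamard matrix)
X = np.array([[ 1,  1,  1,  1],
              [ 1, -1,  1, -1],
              [ 1,  1, -1, -1]])          # three stored x_p, pairwise orthogonal
Y = np.array([[ 1, -1],
              [-1,  1],
              [ 1,  1]])                  # corresponding y_p

W = Y.T @ X                               # Kosko rule: W = sum_p y_p x_p^T

def sign(v):
    return np.where(v >= 0, 1, -1)        # transfer function f(.) = sign(.)

# Because the x_p are orthogonal, W x_p = ||x_p||^2 y_p, so a single
# forward pass recalls each y_p exactly, as the theorem predicts.
for xp, yp in zip(X, Y):
    assert np.array_equal(sign(W @ xp), yp)
```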
A specific example is the association process with noisy characters, such as:

IV. Applications of the BAM network
The design of a BAM network is simple: just a few pairs of typical input and output vectors are needed to construct the weight matrix. In operation, the corresponding output information can be obtained directly from the measured data vector and the weight matrix. This is an efficient method for large-scale parallel processing of large amounts of data, with real-time capability and fault tolerance. Even more attractive, this associative memory method needs no preprocessing of the input vectors; it can search directly, eliminating encoding and decoding work.


V. Implementation of the BAM network

The following code implements a BAM network and uses it to store and recall three names and their telephone numbers. The network weights are designed according to the weight design method described above, and by inputting names containing noise, the bidirectional association between name and telephone number is realized. See the code below for the specific implementation (thanks to the original author for sharing).
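The original code listing is missing from this copy of the post. Below is a minimal Python/NumPy sketch of the same idea; the pattern pairs are hypothetical stand-ins (rows of Hadamard matrices, so the x patterns and y patterns are each pairwise orthogonal), since the post's actual name/phone vectors are not listed:

```python
import numpy as np

def bam_weights(X, Y):
    """Kosko outer-product rule: W = sum_p y_p x_p^T, shape (m, n)."""
    return Y.T @ X

def sign(v):
    return np.where(v >= 0, 1, -1)        # f(.) = sign(.), with sign(0) -> 1

def bam_recall(W, x, max_iters=50):
    """Synchronous bidirectional updates until both layers stop changing."""
    x = np.asarray(x)
    y = sign(W @ x)                       # X -> Y through W
    for _ in range(max_iters):
        x_new = sign(W.T @ y)             # Y -> X through W^T
        y_new = sign(W @ x_new)           # back to Y
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break                         # steady state: energy minimum reached
        x, y = x_new, y_new
    return x, y

# Hypothetical stored pattern pairs (stand-ins for the name/phone vectors)
X = np.array([[1, -1,  1, -1,  1, -1,  1, -1],
              [1,  1, -1, -1,  1,  1, -1, -1],
              [1, -1, -1,  1,  1, -1, -1,  1]])
Y = np.array([[1, -1,  1, -1],
              [1,  1, -1, -1],
              [1, -1, -1,  1]])
W = bam_weights(X, Y)

noisy = X[0].copy()
noisy[0] *= -1                            # flip one bit: a "noisy name"
x_rec, y_rec = bam_recall(W, noisy)       # recovers the clean pair (X[0], Y[0])
```

With these orthogonal patterns, a one-bit-corrupted probe is cleaned up and the correct partner pattern is recalled in a single round trip, illustrating both the hetero-association and the input-reconstruction behavior described in the post.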


**********************************************************

2015-8-7



Copyright notice: this article is the blogger's original work; reproduction without the blogger's permission is prohibited.

