
Bidirectional associative memory

Bidirectional associative memory (BAM) is a type of recurrent neural network. BAM was introduced
by Bart Kosko in 1988.[1] There are two types of associative memory, auto-associative and hetero-
associative. BAM is hetero-associative, meaning that, given a pattern, it can return another pattern
of a potentially different size. It is similar to the Hopfield network in that both are forms of
associative memory; however, Hopfield networks return patterns of the same size.

It is said to be bi-directional as it can respond to inputs from either the input or the output layer.[2]

Topology
A BAM contains two layers of neurons, which we shall denote X and Y. Layers X and Y are fully
connected to each other. Once the weights have been established, input into layer X recalls the
associated pattern in layer Y, and vice versa.

The layers can be connected in both directions (bidirectional), with the result that the weight matrix
for signals sent from the X layer to the Y layer is M and the weight matrix for signals sent from the
Y layer to the X layer is M^T. Thus, the weight matrix is calculated in both directions.[2]
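
A minimal sketch of this wiring, assuming NumPy (the function names and the 0/1 threshold convention are illustrative choices, not taken from the source):

import numpy as np

# One n-by-p weight matrix M connects layer X (n units) to layer Y (p units);
# the same weights, transposed, carry signals in the opposite direction.

def x_to_y(x_pattern, M):
    # Layer X drives layer Y through M.
    return np.where(x_pattern @ M > 0, 1, 0)

def y_to_x(y_pattern, M):
    # Layer Y drives layer X through M^T.
    return np.where(y_pattern @ M.T > 0, 1, 0)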

Procedure

Learning
Imagine we wish to store two associations, A1:B1 and A2:B2.

A1 = (1, 0, 1, 0, 1, 0), B1 = (1, 1, 0, 0)


A2 = (1, 1, 1, 0, 0, 0), B2 = (1, 0, 1, 0)
These are then transformed into the bipolar forms:

X1 = (1, -1, 1, -1, 1, -1), Y1 = (1, 1, -1, -1)


X2 = (1, 1, 1, -1, -1, -1), Y2 = (1, -1, 1, -1)
From there, we calculate M = X1^T Y1 + X2^T Y2, where ^T denotes the transpose. So,

M =
 [  2   0   0  -2 ]
 [  0  -2   2   0 ]
 [  2   0   0  -2 ]
 [ -2   0   0   2 ]
 [  0   2  -2   0 ]
 [  0  -2   2   0 ]
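
A short sketch of this calculation, assuming NumPy; stacking the bipolar pairs as rows turns the sum of outer products X1^T Y1 + X2^T Y2 into a single matrix product:

import numpy as np

# Bipolar forms of the two stored pairs, one pair per row.
X = np.array([[1, -1, 1, -1, 1, -1],
              [1,  1, 1, -1, -1, -1]])
Y = np.array([[1,  1, -1, -1],
              [1, -1,  1, -1]])

# Hebbian outer-product rule: M = X1^T Y1 + X2^T Y2, a 6-by-4 matrix.
M = X.T @ Y
print(M)  # the matrix shown above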
Recall
To retrieve the association A1, we multiply it by M to get (4, 2, -2, -4), which, when run through a
threshold, yields (1, 1, 0, 0), which is B1. To find the reverse association, multiply this by the transpose of
M.
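
Continuing the sketch above (with M as computed in the Learning step), a threshold that maps positive activations to 1 and the rest to 0 reproduces the worked numbers:

A1 = np.array([1, 0, 1, 0, 1, 0])

forward = A1 @ M                          # [ 4  2 -2 -4]
B1_recalled = (forward > 0).astype(int)   # [1 1 0 0], i.e. B1

# Reverse association: run the recalled pattern back through the transpose of M.
backward = B1_recalled @ M.T
A1_recalled = (backward > 0).astype(int)  # [1 0 1 0 1 0], i.e. A1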

Capacity
The memory or storage capacity of BAM may be given as min(m, n), where "n" is the number of units
in the X layer and "m" is the number of units in the Y layer.[3]

The internal matrix has n × p independent degrees of freedom, where n is the dimension of the first vector
(6 in this example) and p is the dimension of the second vector (4). This allows the BAM to reliably
store and recall up to min(n, p) independent vector pairs, or min(6, 4) = 4 in this example.[1] The
capacity can be increased above min(n, p) by sacrificing reliability (incorrect bits on the output).

Stability
A pair (A, B) defines the state of a BAM. To store a pattern, the energy function value for that pattern
has to occupy a minimum point in the energy landscape.

The stability analysis of a BAM is based on the definition of a Lyapunov function (energy function) E,
defined over each state (A, B). When a paired pattern (A, B) is presented to the BAM, the neurons change
states until a bi-directionally stable state is reached, which Kosko proved to correspond to a local
minimum of the energy function. The discrete BAM is proved to converge to a stable state.

The energy function proposed by Kosko is E(A, B) = -A M B^T for the bidirectional case, which for the
particular case A = B corresponds to Hopfield's auto-associative energy function[3] (i.e.
E(A, A) = -A M A^T).
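
Continuing the sketch above, a quick check of the energy at the two stored (bipolar) pairs; the helper function name is ours:

def energy(A, B, M):
    # Kosko's bidirectional energy E(A, B) = -A M B^T.
    return -A @ M @ B

X1, Y1 = np.array([1, -1, 1, -1, 1, -1]), np.array([1, 1, -1, -1])
X2, Y2 = np.array([1, 1, 1, -1, -1, -1]), np.array([1, -1, 1, -1])

print(energy(X1, Y1, M), energy(X2, Y2, M))  # -24 and -16: both stored pairs sit at strongly negative energies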

See also
Autoassociative memory
Self-organizing feature map
References
1. Kosko, B. (1988). "Bidirectional Associative Memories" (http://sipi.usc.edu/~kosko/BAM.pdf)
(PDF). IEEE Transactions on Systems, Man, and Cybernetics. 18 (1): 49–60.
doi:10.1109/21.87054 (https://doi.org/10.1109%2F21.87054).
2. "Principles of Soft Computing, 3ed" (https://www.wileyindia.com/principles-of-soft-computing
-3ed.html). www.wileyindia.com. Retrieved 2020-08-15.
3. Rajasekaran, S.; Pai, G. A. Vijayalakshmi (2003). Neural Networks, Fuzzy Logic and Genetic
Algorithm: Synthesis and Applications (With CD) (https://books.google.com/books?id=bVbj9nhvHd4C).
PHI Learning Pvt. Ltd. ISBN 978-81-203-2186-1.

External links
Bidirectional Associative Memory – Python source code for the Wiki article (https://github.co
m/viralpoetry/bam)
Bidirectional associative memories – ACM Portal Reference (http://portal.acm.org/citation.cf
m?id=46936#)
