Associative Memory
Soft Computing

Associative Memory (AM), topics : description, content-addressability, working of associative memory, encoding or memorization, retrieval or recollection, errors and noise, performance measures - memory capacity and content-addressability. Associative memory models : network architectures - linear associator, Hopfield model and bidirectional associative memory (BAM) model. Auto-associative memory (auto-correlators) : how to store patterns? how to retrieve patterns? recognition of noisy patterns. Bidirectional hetero-associative memory (hetero-correlators) : BAM operations - retrieve the nearest pair, addition and deletion of pattern pairs; energy function for BAM - working of Kosko's BAM, incorrect recall of a pattern; multiple training encoding strategy - augmentation matrix, generalized correlation matrix and algorithm.
Topics
(Lectures 21, 22, 23, 24 ; 4 hours)

1. Description : content-addressability, encoding and retrieval, performance measures - memory capacity and content-addressability.
2. Associative memory models : network architectures - linear associator, Hopfield model, BAM.
3. Auto-associative memory (auto-correlators) : storing patterns, retrieving patterns, recognition of noisy patterns.
4. Bidirectional hetero-associative memory (hetero-correlators) : BAM operations, energy function, multiple training encoding strategy.
5. References.
An associative memory stores a set of patterns and recalls a stored pattern when triggered with an input. There are two classes of associative memory : auto-associative and hetero-associative. Artificial neural networks are commonly used to implement these associative memory models.
1. Associative Memory

An associative memory is a content-addressable structure that maps a set of input patterns to a set of output patterns. A content-addressable structure is a type of memory in which stored data are accessed by their content rather than by an explicit address; the recalled pattern may differ from the input pattern not only in content but possibly also in type and format.
Fig. Example of content-addressable recall : a stored list of names - Charles Darwin, Christopher Columbus, Blaise Pascal, Marco Polo, Neil Armstrong, Sigmund Freud.
An associative memory is of two types : auto-associative and hetero-associative.

- Auto-associative memory : stores pattern vectors y[1], y[2], y[3], . . . , y[M] and, given an input, recalls the stored pattern that most closely resembles it.
- Hetero-associative memory : the recalled pattern is, in general, different from the input pattern not only in content but possibly also in type and format.

Neural networks are commonly used to implement associative memory. The linear associator is the simplest artificial neural associative memory. The Hopfield model and the bidirectional associative memory (BAM) model follow different neural network architectures to memorize information.
Example

The figure below illustrates the working of an associative memory : presenting an input pattern recalls the stored pattern associated with it.

Fig. Working of an associative memory : input pattern -> recalled pattern. The associated pattern pairs, e.g. (7 , 4), are stored in the memory; the association is represented by an arrow symbol.
The two types of associative memory behave differently on recall :

- Auto-associative memory : an auto-associative correlator, presented with a distorted pattern, recalls the stored (perfect) pattern that most closely matches it.
- Hetero-associative memory : presented with an input pattern, it recalls the associated pattern of the stored pair.

Fig. Auto-associative memory : presented distorted pattern -> recall of perfect pattern.
Fig. Hetero-associative memory : input pattern presented -> recall of associated pattern.
Encoding or memorization

Building an associative memory means constructing a connection weight matrix W such that, when an input pattern is presented, the stored pattern associated with it is retrieved. The weight values of the correlation matrix Wk for an associated pattern pair (Xk , Yk) are computed as

    (wij)k = (xi)k (yj)k

where (xi)k is the i-th component of pattern Xk and (yj)k is the j-th component of pattern Yk, for i = 1, 2, . . . , m and j = 1, 2, . . . , n.

Constructing the weight matrix W that stores p different associated pattern pairs simultaneously is accomplished by summing the individual correlation matrices,

    W = α Σ k=1..p  Wk

where α is a proportionality or normalizing constant.
Retrieval or recollection

After memorization, the process of retrieving a stored pattern, given an input pattern, is called decoding. Decoding is accomplished by first computing the net input to the output units,

    input j = Σ i=1..m  xi wij

where input j is the weighted sum of the inputs (the activation value) of node j, and then determining the output

    Yj = +1  if input j >= θj
         -1  otherwise

where θj is the threshold value of output neuron j.
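The encode and decode steps above translate directly into a few lines of linear algebra. Below is a minimal illustrative sketch in Python with NumPy (the function names and the example patterns are assumptions for illustration, not from the lecture):

import numpy as np

def encode(pairs, alpha=1.0):
    # W = alpha * sum_k outer(Xk, Yk), i.e. (w_ij)_k = (x_i)_k (y_j)_k summed over k
    m, n = len(pairs[0][0]), len(pairs[0][1])
    W = np.zeros((m, n))
    for X, Y in pairs:
        W += np.outer(X, Y)
    return alpha * W

def decode(W, x, theta=0.0):
    # input_j = sum_i x_i w_ij, then threshold to +1 / -1
    return np.where(x @ W >= theta, 1, -1)

pair = (np.array([1, -1, 1]), np.array([-1, 1]))
W = encode([pair])
print(decode(W, pair[0]))        # [-1  1] : the stored output pattern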
Errors and noise

The input pattern presented to an associative memory may contain errors and noise, or may be an incomplete version of some previously encoded pattern. When such a corrupted input pattern is presented, the network will retrieve the stored pattern that is closest to the actual input pattern. Associative memories are robust and fault tolerant because there are many processing elements performing highly parallel and distributed computations.
Performance Measures

Memory capacity and content-addressability are the measures of associative memory performance for correct retrieval. These two performance measures are related to each other. Memory capacity refers to the maximum number of associated pattern pairs that can be stored and correctly retrieved. Content-addressability is the ability of the network to retrieve the correct stored pattern from a given, possibly noisy or partial, input.
2. Associative Memory Models

An associative memory is a system that stores mappings from specific input representations to specific output representations, implemented by neural networks. The simplest associative memory model is the linear associator, a feed-forward type of network. It has very low memory capacity and is therefore not much used. The popular models are the Hopfield model and the bidirectional associative memory (BAM) model.
The three models compare as follows :

- Linear associator : a feed-forward network in which the output is computed in a single pass. It can be used as an auto-associator as well as a hetero-associator, but it possesses a very low memory capacity and is therefore not much used.
- Hopfield model : an auto-associative, single-layer recurrent network. The output of the network is fed back as input, and this process continues until the system becomes stable. Hopfield networks operate on binary or bipolar patterns.
- Bidirectional associative memory (BAM) : similar to the linear associator, but the connections are bidirectional, allowing information to flow forward and backward between the two layers. The BAM model can perform both auto-associative and hetero-associative recall of stored information.

The network architectures of these three models are described in the next few slides.
Network architectures

The models follow different neural network architectures to memorize information.

Linear associator model

The linear associator is a feed-forward type network consisting of two layers of processing units, one serving as the input layer and the other as the output layer.
.in
rs
de
ea
SC - AM models
or
ty
,w
.m
yr
kr
ab
ha
of processing units, one serving as the input layer while the other as
weights wij
w11
x1
y1
w21
w12
w22
x2
y2
w2m
inputs
outputs
w1m
wn1
wn2
Xn
Ym
wnm
Fig. Linear associator model
all n input units are connected to all m output units via connection
weight matrix
of
W = [wij]n x m
th
th
output unit.
the
connection
pattern
building
weight
matrix
stores
the
different
associated
associative
weight matrix W
memory
is
constructing
the
connection
then the stored pattern associated with the input pattern is retrieved.
[Continued in next slide]
16
Encoding : the process of constructing the connection weight matrix is called encoding. During encoding, the weight values of the correlation matrix Wk for an associated pattern pair (Xk , Yk) are computed as (wij)k = (xi)k (yj)k, where (xi)k is the i-th component of pattern Xk and (yj)k is the j-th component of pattern Yk, for i = 1, 2, . . . , n and j = 1, 2, . . . , m.

Weight matrix : constructing the weight matrix W that stores p different associated pattern pairs simultaneously is accomplished by summing the individual correlation matrices,

    W = α Σ k=1..p  Wk

where α is a proportionality or normalizing constant.

Decoding : after memorization, the process of retrieving a stored pattern, given an input pattern, is called decoding. It is accomplished by computing first the net input to the output units, input j = Σ i xi wij, where input j stands for the weighted sum of the input or activation value of node j, and then determining the output

    Yj = +1  if input j >= θj
         -1  otherwise.

Note : the output units behave like linear threshold units, computing a weighted sum of the input and producing a -1 or +1 depending on whether the weighted sum is below or above a certain threshold value.

Performance : the linear associator has very low memory capacity; stored pairs interfere with one another (cross-talk) unless the input patterns are mutually orthogonal.
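The capacity limitation is easy to demonstrate numerically. The following is an illustrative sketch (the patterns are invented for the demonstration and are not from the lecture) : recall is exact when the stored input patterns are orthogonal, and cross-talk corrupts it when they are not.

import numpy as np

def recall(W, x):
    # output units as linear threshold units with theta = 0
    return np.where(x @ W >= 0, 1, -1)

# orthogonal bipolar inputs : both pairs are recalled exactly
X1, Y1 = np.array([1,  1, 1,  1]), np.array([ 1, -1])
X2, Y2 = np.array([1, -1, 1, -1]), np.array([-1,  1])
W = np.outer(X1, Y1) + np.outer(X2, Y2)
print(recall(W, X1), recall(W, X2))        # [ 1 -1] [-1  1] : correct

# non-orthogonal inputs : cross-talk corrupts the recall of X1
X1, Y1 = np.array([1, 1,  1,  1]), np.array([ 1, -1])
X2, Y2 = np.array([1, 1, -1,  1]), np.array([-1,  1])
X3, Y3 = np.array([1, 1,  1, -1]), np.array([-1,  1])
W = np.outer(X1, Y1) + np.outer(X2, Y2) + np.outer(X3, Y3)
print(recall(W, X1))                       # [1 1] : no longer equals Y1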
Hopfield model

The Hopfield model is a single-layer recurrent network; patterns, rather than associated pattern pairs, are stored in memory.

Fig. Hopfield model network architecture : a single layer of neurons (here four), inputs I1, I2, I3, I4, outputs V1, V2, V3, V4, connection weights wij; an alternate view shows every neuron connected to every other neuron.

In the Hopfield network, each unit is connected to every other unit in the network but not to itself. The connection weights are symmetric,

    wij = wji    for i, j = 1, 2, . . . , m,

and the net input to unit j is

    input j = Σ i=1..m  xi wij + Ij    for j = 1, 2, . . . , m,

where Ij is the external input to unit j. Since the memory is auto-associative, each stored pair satisfies Xk = Yk, so the correlation matrix is built from each pattern with itself.
Weight matrix : construction of the weight matrix to store p patterns is accomplished by

    W = Σ k=1..p  Xk^T Yk    where Yk = Xk (auto-association),

with the diagonal elements set to zero, since no unit is connected to itself.

Decoding : in the Hopfield model, retrieving a stored pattern is accomplished by computing the net input, input j = Σ i xi wij, where input j stands for the weighted sum of the input or activation value of node j, for j = 1, 2, . . . , m, and xi is the i-th component of the pattern, and then determining the output

    Yj = +1  if input j >= θj
         -1  otherwise.

The output units are threshold units that compute a weighted sum of the input and produce -1 or +1 depending on whether it is below or above the threshold. The next section illustrates, with examples, how to store patterns in such an auto-associative memory and how to retrieve them.
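A compact sketch of a discrete Hopfield-style recall loop follows (illustrative Python; the assumptions here are bipolar patterns, zero thresholds with ties broken toward +1, and asynchronous unit-by-unit updates, a standard choice that guarantees convergence to a stable state) :

import numpy as np

def hopfield_store(patterns):
    # W = sum over stored patterns of outer(x, x), with zero diagonal
    # (each unit is connected to every other unit but not to itself)
    m = len(patterns[0])
    W = np.zeros((m, m), dtype=int)
    for x in patterns:
        W += np.outer(x, x)
    np.fill_diagonal(W, 0)
    return W

def hopfield_recall(W, x, max_sweeps=100):
    x = np.array(x, dtype=int)
    for _ in range(max_sweeps):
        changed = False
        for j in range(len(x)):          # asynchronous unit-by-unit update
            s = 1 if W[j] @ x >= 0 else -1
            if s != x[j]:
                x[j], changed = s, True
        if not changed:                   # stable state reached
            return x
    return x

p1 = np.array([1, 1, 1, -1, -1, -1])
p2 = np.array([1, -1, 1, -1, 1, -1])
W = hopfield_store([p1, p2])
print(hopfield_recall(W, np.array([-1, 1, 1, -1, -1, -1])))  # recovers p1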
Bidirectional associative memory (BAM) model

The BAM model extends the Hopfield network by incorporating an additional layer, so that the recurrent network performs hetero-associations as well as auto-associations on the stored memories. The structure of the BAM is similar to that of the linear associator, but the connections are bidirectional, i.e.

    wij = wji    for i = 1, 2, . . . , n and j = 1, 2, . . . , m.

Fig. BAM network architecture : input neurons x1, x2, . . . , Xn, output neurons y1, y2, . . . , Ym, bidirectional connection weights wij (w11, w12, . . . , wnm).

Constructing the connection weight matrix W that stores p different associated pattern pairs simultaneously is accomplished by

    W = Σ k=1..p  Xk^T Yk.
3. Auto-associative Memory (Auto-correlators)

The Hopfield model described above is an auto-associative memory : patterns, rather than associated pattern pairs, are stored. This section illustrates, through examples, how an auto-correlator stores patterns, how it retrieves them, and how it recognizes a noisy pattern.
How to store patterns : example

Consider the three bipolar patterns A1, A2, A3 to be stored as an auto-correlator :

    A1 = (-1,  1, -1,  1)
    A2 = ( 1,  1,  1, -1)
    A3 = (-1, -1, -1,  1)

Note that the outer product of two vectors U and V is

    U^T V = [ U1V1  U1V2  U1V3
              U2V1  U2V2  U2V3
              U3V1  U3V2  U3V3
              U4V1  U4V2  U4V3 ]

Thus, the outer products of the three bipolar patterns are

    A1^T A1 = [  1 -1  1 -1 ]    A2^T A2 = [  1  1  1 -1 ]    A3^T A3 = [  1  1  1 -1 ]
              [ -1  1 -1  1 ]              [  1  1  1 -1 ]              [  1  1  1 -1 ]
              [  1 -1  1 -1 ]              [  1  1  1 -1 ]              [  1  1  1 -1 ]
              [ -1  1 -1  1 ]              [ -1 -1 -1  1 ]              [ -1 -1 -1  1 ]

and the connection matrix that stores all three patterns is their sum :

    T = [t ij] = Σ i=1..3  Ai^T Ai = [  3  1  3 -3 ]
                                     [  1  3  1 -1 ]
                                     [  3  1  3 -3 ]
                                     [ -3 -1 -3  3 ]
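The connection matrix T can be checked in a few lines of NumPy (an illustrative sketch, not part of the lecture) :

import numpy as np

A = np.array([[-1,  1, -1,  1],
              [ 1,  1,  1, -1],
              [-1, -1, -1,  1]])

# T = sum of the outer products Ai^T Ai of the stored bipolar patterns
T = sum(np.outer(a, a) for a in A)
print(T)
# [[ 3  1  3 -3]
#  [ 1  3  1 -1]
#  [ 3  1  3 -3]
#  [-3 -1 -3  3]]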
How to retrieve a pattern : example

The previous slide shows the connection matrix T of the three bipolar patterns A1, A2, A3, stored as

    T = [t ij] = Σ i=1..3  Ai^T Ai = [  3  1  3 -3 ]
                                     [  1  3  1 -1 ]
                                     [  3  1  3 -3 ]
                                     [ -3 -1 -3  3 ]

A stored pattern is retrieved with the recall rule

    aj new = f (α j , aj old)    for j = 1, 2, . . . , p,    where α j = Σ i=1..p  ai t ij

and f is the two-parameter bipolar threshold function : f returns +1 if α > 0, -1 if α < 0, and the old value aj old if α = 0.

Present the stored pattern A2 = (1, 1, 1, -1). The weighted sums α j are

    α 1 = 1x3    + 1x1    + 1x3    + (-1)x(-3) =  10
    α 2 = 1x1    + 1x3    + 1x1    + (-1)x(-1) =   6
    α 3 = 1x3    + 1x1    + 1x3    + (-1)x(-3) =  10
    α 4 = 1x(-3) + 1x(-1) + 1x(-3) + (-1)x3    = -10

so that (α j , aj old) = (10, 1), (6, 1), (10, 1), (-10, -1), and therefore

    (a1 new, a2 new, a3 new, a4 new) = ( 1, 1, 1, -1 ) = A2.

The stored pattern is recalled correctly.
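The recall computation is a matrix-vector product followed by the two-parameter threshold. An illustrative sketch (the threshold keeps the old component when the weighted sum is zero) :

import numpy as np

def recall(T, a):
    alpha = a @ T                       # alpha_j = sum_i a_i t_ij
    return np.where(alpha > 0, 1, np.where(alpha < 0, -1, a))

T = np.array([[ 3,  1,  3, -3],
              [ 1,  3,  1, -1],
              [ 3,  1,  3, -3],
              [-3, -1, -3,  3]])
A2 = np.array([1, 1, 1, -1])
print(recall(T, A2))                    # [ 1  1  1 -1] = A2 : correct recall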
Recognition of noisy patterns : example

Consider a vector A' = (1, 1, 1, 1), which is a distorted presentation of one of the stored patterns. To find the proximity of the noisy vector to the stored patterns, use the Hamming distance (HD). The Hamming distance of a vector X = (x1, x2, . . . , xn) from Y = (y1, y2, . . . , yn) is

    HD (X , Y) = Σ i=1..n  | xi - yi |.

The HDs of A' from each of the stored patterns A1, A2, A3 are

    HD (A' , A1) = |1-(-1)| + |1-1| + |1-(-1)| + |1-1| = 4
    HD (A' , A2) = 2
    HD (A' , A3) = 6

so A' is closest to A2. Applying the recall rule with A' presented, the weighted sums α j are

    α 1 = 1x3    + 1x1    + 1x3    + 1x(-3) =  4
    α 2 = 1x1    + 1x3    + 1x1    + 1x(-1) =  4
    α 3 = 1x3    + 1x1    + 1x3    + 1x(-3) =  4
    α 4 = 1x(-3) + 1x(-1) + 1x(-3) + 1x3    = -4

so that (aj new) = (1, 1, 1, -1), which is the stored pattern A2. The noisy vector A' is thus recognized as A2.
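The same steps in code (illustrative; this reuses the connection matrix T and the recall rule from above) :

import numpy as np

def hamming(x, y):
    return int(np.sum(np.abs(x - y)))    # HD(x, y) = sum_i |x_i - y_i|

A1 = np.array([-1,  1, -1,  1])
A2 = np.array([ 1,  1,  1, -1])
A3 = np.array([-1, -1, -1,  1])
T = sum(np.outer(a, a) for a in (A1, A2, A3))

A_noisy = np.array([1, 1, 1, 1])
print([hamming(A_noisy, a) for a in (A1, A2, A3)])   # [4, 2, 6] : closest to A2

alpha = A_noisy @ T                                   # [ 4  4  4 -4]
print(np.where(alpha > 0, 1, np.where(alpha < 0, -1, A_noisy)))   # [ 1  1  1 -1] = A2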
4. Bidirectional Hetero-associative Memory

The Hopfield model discussed in the previous section is a one-layer unidirectional auto-associator. Kosko (1987) extended this network to a two-layer bidirectional structure called the bidirectional associative memory (BAM), which can achieve hetero-association : if the model recalls a pattern Y given a pattern X, or vice versa, it is termed hetero-associative.
BAM operations

A BAM is a two-layer nonlinear neural network : one layer holds elements Ai and the other layer holds the associated elements Bi. Consider N training pairs { (A1, B1), (A2, B2), . . . , (AN, BN) }, with components coded either in binary mode (ON = 1 and OFF = 0) or in bipolar mode (ON = 1 and OFF = -1). With Xi, Yi denoting the bipolar versions of Ai, Bi, the original correlation matrix of the BAM is

    M0 = Σ i=1..N  [ Xi ]^T [ Yi ].

The BAM operations described next are : retrieve the nearest pair, and addition and deletion of pattern pairs.
Retrieve the nearest pair

Given an initial pattern α presented to one layer, the BAM retrieves the nearest stored pair by passing information back and forth between the two layers,

    β' = φ (α M),    α' = φ (β' M^T),    β'' = φ (α' M),  . . .

until the pair stops changing. Here M is the correlation matrix and φ is the threshold function

    φ (F) = G = ( g1, g2, . . . , gr )    for F = ( f1, f2, . . . , fr ),

    gi = 1                              if fi > 0
         0 (binary) or -1 (bipolar)     if fi < 0
         previous gi                    if fi = 0.

Kosko has proved that this process will converge for any correlation matrix M.
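A sketch of the bidirectional retrieval loop (illustrative Python; bipolar coding is assumed, and since no previous B exists on the first pass, ties there are broken toward +1, which is an assumption of this sketch) :

import numpy as np

def phi(f, prev):
    # threshold: +1 if f>0, -1 if f<0, keep the previous value if f=0
    return np.where(f > 0, 1, np.where(f < 0, -1, prev))

def bam_retrieve(M, alpha, max_steps=100):
    beta = phi(alpha @ M, np.ones(M.shape[1], dtype=int))   # first pass
    for _ in range(max_steps):
        alpha_next = phi(beta @ M.T, alpha)
        beta_next = phi(alpha_next @ M, beta)
        if np.array_equal(alpha_next, alpha) and np.array_equal(beta_next, beta):
            break                        # bidirectional stable state reached
        alpha, beta = alpha_next, beta_next
    return alpha, beta

X = np.array([[ 1, -1,  1], [-1,  1, -1]])
Y = np.array([[ 1,  1, -1, -1], [-1, -1,  1,  1]])
M = sum(np.outer(x, y) for x, y in zip(X, Y))
print(bam_retrieve(M, X[0]))   # (array([ 1, -1,  1]), array([ 1,  1, -1, -1]))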
Addition and deletion of pattern pairs

Given a set of pattern pairs (Xi , Yi), for i = 1, 2, . . . , n, and their correlation matrix M = X1^T Y1 + X2^T Y2 + . . . + Xn^T Yn, a new pair (X' , Y') can be added, or an existing pair (Xj , Yj) can be deleted from the memory model.

Addition :  Mnew = X1^T Y1 + X2^T Y2 + . . . + Xn^T Yn + X'^T Y'

Deletion :  Mnew = M - Xj^T Yj

Note : the addition and deletion of pattern pairs give the system the flavour of a human memory, exhibiting learning and forgetfulness.
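Both operations are single matrix updates. A minimal illustrative sketch :

import numpy as np

def add_pair(M, x, y):
    return M + np.outer(x, y)       # learn a new association

def delete_pair(M, x, y):
    return M - np.outer(x, y)       # forget a stored association

# usage : start from an empty memory, learn two pairs, forget one
M = np.zeros((3, 4), dtype=int)
M = add_pair(M, np.array([1, -1, 1]), np.array([1, 1, -1, -1]))
M = add_pair(M, np.array([-1, 1, 1]), np.array([-1, 1, -1, 1]))
M = delete_pair(M, np.array([1, -1, 1]), np.array([1, 1, -1, -1]))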
Energy function for BAM

Adding new pattern pairs must not destroy the previously stored pairs. The stability of a BAM is proved by identifying an energy function E with each state (A , B) :

- for an auto-associative memory, the energy function is E(A) = - A M A^T ;
- for a bidirectional hetero-associative memory, E(A , B) = - A M B^T.

When a pair (A , B) is presented as the initial condition to the BAM, the neurons change their states until a bidirectional stable state (Af , Bf) is reached. Kosko has shown that such a stable state is reached for any matrix M and that it corresponds to a local minimum of the energy function : each cycle of decoding lowers the energy E.

However, if the energy E = - Ai M Bi^T evaluated at a training pair (Ai , Bi) does not constitute a local minimum, then that pair cannot be recalled, even though one starts with α = Ai. Thus Kosko's encoding method does not ensure that the stored pairs are at a local minimum.
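The energy of a state is a single inner product. An illustrative sketch (the patterns are invented for the demonstration) showing that stored pairs sit at low energy relative to non-stored states :

import numpy as np

def energy(M, a, b):
    # E(A, B) = -A M B^T ; bidirectional stable states sit at local minima
    return -float(a @ M @ b)

X = np.array([[1, -1, 1, -1], [1, 1, -1, -1]])
Y = np.array([[1, -1], [-1, 1]])
M = sum(np.outer(x, y) for x, y in zip(X, Y))

print(energy(M, X[0], Y[0]))                     # -8.0 : stored pair
print(energy(M, X[1], Y[1]))                     # -8.0 : stored pair
print(energy(M, np.array([1, 1, 1, 1]), Y[0]))   #  0.0 : non-stored state, higher E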
Working of Kosko's BAM : example 1

Consider N = 3 pattern pairs (A1 , B1), (A2 , B2), (A3 , B3), converted to their bipolar forms (X1 , Y1), (X2 , Y2), (X3 , Y3), and the correlation matrix

    M = X1^T Y1 + X2^T Y2 + X3^T Y3.

Suppose we start with α = X3 and hope to retrieve the associated pair Y3. Decoding proceeds as

    β' = φ (α M),    α' = φ (β' M^T),    β'' = φ (α' M), . . .

and, for this set of pairs, α' = φ (β' M^T) returns α = X3, so β'' = β' and the process has reached equilibrium : the final pair is (αF , βF) = (X3 , Y3), the desired result.
Incorrect recall of a pattern : example 2

Consider N = 3 binary pattern pairs

    A1 = ( 1 0 0 1 1 1 0 0 0 )    B1 = ( 1 1 1 0 0 0 0 1 0 )
    A2 = ( 0 1 1 1 0 0 1 1 1 )    B2 = ( 1 0 0 0 0 0 0 0 1 )
    A3 = ( 1 0 1 0 1 1 0 1 1 )    B3 = ( 0 1 0 1 0 0 1 0 1 )

with bipolar forms

    X1 = (  1 -1 -1  1  1  1 -1 -1 -1 )    Y1 = (  1  1  1 -1 -1 -1 -1  1 -1 )
    X2 = ( -1  1  1  1 -1 -1  1  1  1 )    Y2 = (  1 -1 -1 -1 -1 -1 -1 -1  1 )
    X3 = (  1 -1  1 -1  1  1 -1  1  1 )    Y3 = ( -1  1 -1  1 -1 -1  1 -1  1 )

The correlation matrix M = X1^T Y1 + X2^T Y2 + X3^T Y3 is

    M = [ -1  3  1  1 -1 -1  1  1 -1 ]
        [  1 -3 -1 -1  1  1 -1 -1  1 ]
        [ -1 -1 -3  1 -1 -1  1 -3  3 ]
        [  3 -1  1 -3 -1 -1 -3  1 -1 ]
        [ -1  3  1  1 -1 -1  1  1 -1 ]
        [ -1  3  1  1 -1 -1  1  1 -1 ]
        [  1 -3 -1 -1  1  1 -1 -1  1 ]
        [ -1 -1 -3  1 -1 -1  1 -3  3 ]
        [ -1 -1 -3  1 -1 -1  1 -3  3 ]
Suppose we start with α = X2, hoping to retrieve the associated pair Y2 :

    α M        = (   5 -19 -13  -5   1   1  -5 -13  13 )
    β' = φ(α M) = (  1  -1  -1  -1   1   1  -1  -1   1 )

    β' M^T          = ( -11  11   5   5 -11 -11  11   5   5 )
    α' = φ(β' M^T)  = (  -1   1   1   1  -1  -1   1   1   1 ) = α

so the process has reached equilibrium with F = β' = ( 1 -1 -1 -1 1 1 -1 -1 1 ). But F is not Y2 = ( 1 -1 -1 -1 -1 -1 -1 -1 1 ) : the fifth and sixth components differ. Thus the vector pair (X2 , Y2) is not recalled correctly by Kosko's decoding process.
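This failure can be reproduced directly (illustrative Python; the pairs and the matrix are exactly those of the example above) :

import numpy as np

def phi(f, prev):
    return np.where(f > 0, 1, np.where(f < 0, -1, prev))

X = np.array([[ 1,-1,-1, 1, 1, 1,-1,-1,-1],
              [-1, 1, 1, 1,-1,-1, 1, 1, 1],
              [ 1,-1, 1,-1, 1, 1,-1, 1, 1]])
Y = np.array([[ 1, 1, 1,-1,-1,-1,-1, 1,-1],
              [ 1,-1,-1,-1,-1,-1,-1,-1, 1],
              [-1, 1,-1, 1,-1,-1, 1,-1, 1]])
M = sum(np.outer(x, y) for x, y in zip(X, Y))

a = X[1]                                  # present X2
b = phi(a @ M, np.ones(9, dtype=int))     # beta' = phi(alpha M)
print(a @ M)                              # [  5 -19 -13  -5   1   1  -5 -13  13]
print(np.array_equal(b, Y[1]))            # False : (X2, Y2) is recalled incorrectly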
Check with the energy function : the energy at the desired pair (X2 , Y2) is

    E2 = - X2 M Y2^T = -71,

while the energy at the equilibrium actually reached, (X2 , F), is

    EF = - X2 M F^T = -75,

which is lower than E2, confirming the hypothesis that (X2 , Y2) is not at its local minimum of E.

Note : the correlation matrix M used by Kosko does not guarantee that the energy of a training pair is at a local minimum; a pair (Ai , Bi) can be recalled if and only if this pair is at a local minimum of the energy surface.
Multiple training encoding strategy

Kosko extended the unidirectional auto-associative memory to the bidirectional associative processes, using the correlation matrix M = Σ Xi^T Yi computed from the pattern pairs. With this matrix, the system retrieves the nearest pair by the recall equations given earlier. However, Kosko's encoding method does not ensure that the stored pairs are at local minima of the energy and hence can result in incorrect recall.

The multiple training encoding strategy augments the correlation matrix by weighting the pair that must be recalled. The augmentation matrix for a pair (Xi , Yi) is

    P = (q - 1) Xi^T Yi

so that the pair's term in M becomes q Xi^T Yi, where the pair weight q is viewed as the training multiplicity : it denotes the minimum number of times the pattern pair (Xi , Yi) must be used in training to guarantee recall of that pair.

The next few slides explain the multiple training encoding strategy for the recall of three pattern pairs (X1 , Y1), (X2 , Y2), (X3 , Y3) using one and the same augmentation matrix M, followed by an algorithm that summarizes the complete process of multiple training encoding.
Example : take the same three bipolar pattern pairs as in example 2,

    X1 = (  1 -1 -1  1  1  1 -1 -1 -1 )    Y1 = (  1  1  1 -1 -1 -1 -1  1 -1 )
    X2 = ( -1  1  1  1 -1 -1  1  1  1 )    Y2 = (  1 -1 -1 -1 -1 -1 -1 -1  1 )
    X3 = (  1 -1  1 -1  1  1 -1  1  1 )    Y3 = ( -1  1 -1  1 -1 -1  1 -1  1 )

and encode them with the pair (X2 , Y2) trained twice :

    M = X1^T Y1 + 2 X2^T Y2 + X3^T Y3
      = [ -2  4  2  2  0  0  2  2 -2 ]
        [  2 -4 -2 -2  0  0 -2 -2  2 ]
        [  0 -2 -4  0 -2 -2  0 -4  4 ]
        [  4 -2  0 -4 -2 -2 -4  0  0 ]
        [ -2  4  2  2  0  0  2  2 -2 ]
        [ -2  4  2  2  0  0  2  2 -2 ]
        [  2 -4 -2 -2  0  0 -2 -2  2 ]
        [  0 -2 -4  0 -2 -2  0 -4  4 ]
        [  0 -2 -4  0 -2 -2  0 -4  4 ]
Starting again with α = X2 :

    α M        = (  14 -28 -22 -14  -8  -8 -14 -22  22 )
    β' = φ(α M) = (  1  -1  -1  -1  -1  -1  -1  -1   1 ) = Y2

    β' M^T          = ( -16  16  18  18 -16 -16  16  18  18 )
    α' = φ(β' M^T)  = (  -1   1   1   1  -1  -1   1   1   1 ) = X2

so F = β' = Y2. Thus (X2 , Y2) is correctly recalled, using the augmented correlation matrix M.
Note : the previous slide showed that the pattern pair (X2 , Y2) is correctly recalled using the augmented correlation matrix M = X1^T Y1 + 2 X2^T Y2 + X3^T Y3, but the same matrix cannot correctly recall the other pattern pair (X1 , Y1) :

    α = X1 = ( 1 -1 -1 1 1 1 -1 -1 -1 ),    Y1 = ( 1 1 1 -1 -1 -1 -1 1 -1 )

    α M        = (  -6  24  22   6   4   4   6  22 -22 )
    β' = φ(α M) = ( -1   1   1   1   1   1   1   1  -1 )  which is not Y1

    β' M^T          = (  16 -16 -18 -18  16  16 -16 -18 -18 )
    α' = φ(β' M^T)  = (   1  -1  -1  -1   1   1  -1  -1  -1 )  which is not X1

    α' M = ( -14  28  22  14   8   8  14  22 -22 ),    β'' = φ(α' M) = β'

so the process stabilizes with F = β', which is not Y1. Thus the pattern pair (X1 , Y1) is not correctly recalled using this augmented correlation matrix M. To tackle this problem, the augmented correlation matrix needs to be further modified.
The pair (X1 , Y1) cannot be recalled under the same augmentation matrix M that is able to recall (X2 , Y2). However, this problem can be solved by multiple training of (X1 , Y1) as well, which yields the further-augmented correlation matrix

    M = 2 X1^T Y1 + 2 X2^T Y2 + X3^T Y3
      = [ -1  5  3  1 -1 -1  1  3 -3 ]
        [  1 -5 -3 -1  1  1 -1 -3  3 ]
        [ -1 -3 -5  1 -1 -1  1 -5  5 ]
        [  5 -1  1 -5 -3 -3 -5  1 -1 ]
        [ -1  5  3  1 -1 -1  1  3 -3 ]
        [ -1  5  3  1 -1 -1  1  3 -3 ]
        [  1 -5 -3 -1  1  1 -1 -3  3 ]
        [ -1 -3 -5  1 -1 -1  1 -5  5 ]
        [ -1 -3 -5  1 -1 -1  1 -5  5 ]

Now observe in the next slide that, with this matrix, all three pattern pairs can be correctly recalled.
Recall of all three pattern pairs using M = 2 X1^T Y1 + 2 X2^T Y2 + X3^T Y3 :

Pair (X1 , Y1) :
    α M          = (   3  33  31  -3  -5  -5  -3  31 -31 )
    φ(α M)       = (   1   1   1  -1  -1  -1  -1   1  -1 ) = Y1
    β' M^T       = (  13 -13 -19  23  13  13 -13 -19 -19 )
    φ(β' M^T)    = (   1  -1  -1   1   1   1  -1  -1  -1 ) = X1

Pair (X2 , Y2) :
    α M          = (   7 -35 -29  -7  -1  -1  -7 -29  29 )
    φ(α M)       = (   1  -1  -1  -1  -1  -1  -1  -1   1 ) = Y2
    β' M^T       = ( -15  15  17  19 -15 -15  15  17  17 )
    φ(β' M^T)    = (  -1   1   1   1  -1  -1   1   1   1 ) = X2

Pair (X3 , Y3) :
    α M          = ( -13  17  -1  13  -5  -5  13  -1   1 )
    φ(α M)       = (  -1   1  -1   1  -1  -1   1  -1   1 ) = Y3
    β' M^T       = (   1  -1  17 -13   1   1  -1  17  17 )
    φ(β' M^T)    = (   1  -1   1  -1   1   1  -1   1   1 ) = X3

In each case the recall equations return the stored pair : all three pattern pairs are correctly recalled.
Thus, the multiple training encoding strategy ensures the correct recall of a pattern pair by suitably augmenting the correlation matrix M. The generalization of the correlation matrix, for the correct recall of all training pairs, is written as

    M = Σ i=1..N  qi Xi^T Yi

where the qi are positive real numbers. This modified correlation matrix is called the generalized correlation matrix.
Algorithm (multiple training encoding strategy)

Algorithm Mul_Tr_Encode ( N , X , Y , q )   where

    N : number of stored pattern pairs
    X = ( X1 , X2 , . . . , XN ) , the bipolar input patterns, with Xi = ( x i1 , x i2 , . . . , x in )
    Y = ( Y1 , Y2 , . . . , YN ) , the bipolar output patterns, with Yj = ( y j1 , y j2 , . . . , y jn )
    q = ( q1 , q2 , . . . , qN ) , the vector of pair weights

Step 1. Initialize the correlation matrix M to the null matrix : M <- [0]
Step 2. Compute the correlation matrix M :
        for i <- 1 to N
            M <- M (+) [ qi * Transpose ( Xi ) (.) ( Yi ) ]
        end
        (symbols : (+) matrix addition, (.) matrix multiplication, * scalar multiplication)
Step 3. Read the input bipolar pattern A
Step 4. Compute A_M , where A_M <- A (.) M
Step 5. Apply the threshold function φ to A_M to get B' , i.e. B' <- φ ( A_M )
Step 6. Output B' , the associated pattern recalled
end
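A direct transcription of the algorithm into illustrative Python (NumPy's outer product plays the role of Transpose(Xi) (.) Yi; the function names are assumptions) :

import numpy as np

def mul_tr_encode(X, Y, q):
    # M = sum_i q_i * Xi^T Yi   (generalized correlation matrix)
    M = np.zeros((X.shape[1], Y.shape[1]))
    for xi, yi, qi in zip(X, Y, q):
        M += qi * np.outer(xi, yi)
    return M

def mul_tr_decode(M, a, prev=None):
    # B' = phi(A M); ties keep the previous value (+1 if there is none)
    f = a @ M
    prev = np.ones(M.shape[1], dtype=int) if prev is None else prev
    return np.where(f > 0, 1, np.where(f < 0, -1, prev))

# usage with the three pairs of the worked example and weights q = (2, 2, 1)
X = np.array([[ 1,-1,-1, 1, 1, 1,-1,-1,-1],
              [-1, 1, 1, 1,-1,-1, 1, 1, 1],
              [ 1,-1, 1,-1, 1, 1,-1, 1, 1]])
Y = np.array([[ 1, 1, 1,-1,-1,-1,-1, 1,-1],
              [ 1,-1,-1,-1,-1,-1,-1,-1, 1],
              [-1, 1,-1, 1,-1,-1, 1,-1, 1]])
M = mul_tr_encode(X, Y, q=[2, 2, 1])
print(all(np.array_equal(mul_tr_decode(M, x), y) for x, y in zip(X, Y)))  # True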
5. References : Textbooks