

www.myreaders.info
http://www.myreaders.info/html/soft_computing.html

Associative Memory

Soft Computing

Associative Memory (AM), topics : description, content-addressability; AM related terms - encoding or memorization, retrieval or recollection, errors and noise; performance measures - memory capacity and content-addressability. Associative memory models : network architectures - linear associator, Hopfield model and bi-directional associative memory (BAM) model. Auto-associative memory (auto-correlators) : how to store patterns? how to retrieve patterns? recognition of noisy patterns. Bi-directional hetero-associative memory (hetero-correlators) : BAM operations - retrieve the nearest pair, addition and deletion of pattern pairs; energy function for BAM - working of Kosko's BAM, incorrect recall of pattern; multiple training encoding strategy - augmentation matrix, generalized correlation matrix and algorithm.


Topics
(Lectures 21, 22, 23, 24 ; 4 hours)

1. Associative Memory : description; AM related terms - encoding or memorization, retrieval or recollection, errors and noise; performance measures - memory capacity and content-addressability.  (slides 03-12)
2. Associative memory models : network architectures - linear associator, Hopfield model, bi-directional associative memory (BAM).  (slides 13-20)
3. Auto-associative memory (auto-correlators) : how to store patterns, how to retrieve patterns, recognition of noisy patterns.  (slides 21-24)
4. Bi-directional hetero-associative memory (hetero-correlators) : BAM operations - retrieve the nearest pair, addition and deletion of pattern pairs; energy function for BAM - working of Kosko's BAM, incorrect recall of pattern; multiple training encoding strategy - augmentation matrix, generalized correlation matrix and algorithm.  (slides 25-41)
5. References  (slide 42)


Associative Memory

An associative memory is a content-addressable structure that maps a set of input patterns to a set of output patterns. It permits the recall of data based on the degree of similarity between the input pattern and the patterns stored in memory. There are two classes of associative memory : auto-associative and hetero-associative. An auto-associative memory retrieves a previously stored pattern that most closely resembles the current pattern; a hetero-associative memory retrieves a pattern that is, in general, different from the input pattern not only in content but possibly also in type and format. Neural networks are used to implement these associative memory models.

SC - AM description

1. Associative Memory

An associative memory is a content-addressable structure that maps a set of input patterns to a set of output patterns. A content-addressable structure is a type of memory in which the stored data is accessed by its content rather than by an explicit address. There are two types of associative memory : auto-associative and hetero-associative.

An auto-associative memory retrieves a previously stored pattern that most closely resembles the current input pattern.

In hetero-associative memory, the retrieved pattern is in general different from the input pattern not only in content but possibly also in type and format.

An associative memory is a content-addressable structure that allows the recall of information based on partial knowledge of its contents.

The figure below shows a memory containing names of several people. If the given memory is content-addressable, then using the erroneous string "Crhistpher Columbos" as key is sufficient to retrieve the correct name "Christopher Columbus". In this sense, this type of memory is robust and fault-tolerant, because it exhibits some form of error-correction capability.

                              Thomas Edison
                              Christopher Columbus
                              Albert Einstein
  "Crhistpher Columbos"  -->  Charles Darwin        -->  "Christopher Columbus"
        (input key)           Blaise Pascal               (retrieved name)
                              Marco Polo
                              Neil Armstrong
                              Sigmund Freud

Fig. A content-addressable memory : access is by content, in contrast to an explicit address in the traditional computer memory system. The memory allows the recall of information based on partial knowledge of its contents.

[Continued in next slide]
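As a toy sketch of this kind of content-addressable recall (using only the Python standard library; `difflib`'s similarity matching stands in for the memory's error-correction, and the stored names are those in the figure above):

```python
import difflib

# Stored contents of the memory (the names in the figure above).
names = [
    "Thomas Edison", "Christopher Columbus", "Albert Einstein",
    "Charles Darwin", "Blaise Pascal", "Marco Polo",
    "Neil Armstrong", "Sigmund Freud",
]

def recall(key):
    """Content-addressable lookup: return the stored name nearest to the key."""
    return difflib.get_close_matches(key, names, n=1, cutoff=0.0)[0]

# The erroneous key still retrieves the correct name.
print(recall("Crhistpher Columbos"))
```

Here nearest-match lookup plays the role of the error-correcting recall; a neural associative memory achieves the same effect through distributed weights rather than explicit string comparison.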

[Continued from previous slide]

As stated, there are two classes : auto-associative and hetero-associative memory.

Auto-associative memory : the memory stores the pattern vectors y[1], y[2], y[3], . . . . , y[M]; when later presented with a noisy or partial version of a stored vector, the memory will output the stored pattern vector y(m) that most closely resembles the input.

Hetero-associative memory : the memory stores associated pattern pairs {x(1), y(1)}, . . . , {x(M), y(M)}; given a key pattern x(m), it outputs the associated pattern y(m), which may differ from the key in content, type and format.

Neural networks are used to implement associative memory models. The linear associator is the simplest artificial neural associative memory. Two other popular models are the Hopfield model and the bi-directional associative memory (BAM). These models follow different neural network architectures to memorize information.

Example :

An associative memory can be considered as a storehouse of associated patterns. When the storehouse is triggered or excited with a pattern, the associated pattern is retrieved automatically. The input could be an exact or distorted or partial representation of a stored pattern.

The figure below illustrates the working of an associative memory : the associated pattern pairs ( , ), ( , +), (7 , 4) are stored, the association being represented by the symbol between Input Pattern and Recalled Pattern. Presenting the input pattern of a pair recalls its associated pattern.


As stated before, there are two classes of associative memory : auto-associative and hetero-associative memory.

An auto-associative memory, also known as an auto-associative correlator, is used to retrieve the stored pattern that most closely resembles the input pattern.

A hetero-associative memory, also known as a hetero-associative correlator, is used to retrieve a pattern that is, in general, different from the input pattern not only in content but possibly also different in type and format.

Examples :
- Hetero-associative memory : an input pattern is presented, and the recall is of the associated (different) pattern.
- Auto-associative memory : a distorted pattern is presented, and the recall is of the perfect stored pattern.


Here explained : encoding or memorization, retrieval or recollection, errors and noise, and the performance measures - memory capacity and content-addressability.

Encoding or memorization :
Building an associative memory means constructing a connection weight matrix W such that when an input pattern is presented, the stored pattern associated with the input pattern is retrieved. This process of constructing the connection weight matrix is called encoding. Encoding of an associated pattern pair (Xk , Yk) is accomplished by

(wij)k = (xi)k (yj)k

where (xi)k is the i-th component of pattern Xk and (yj)k is the j-th component of pattern Yk , for i = 1, 2, . . . , m and j = 1, 2, . . . , n.

Construction of the connection weight matrix W , to store p different associated pattern pairs simultaneously, is accomplished by

W = Σ k=1..p  Wk

where Wk = [ (wij)k ] is the correlation matrix of the k-th pair.
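The encoding step above can be sketched as follows (an illustrative numpy version; the pattern values are made up for the demonstration):

```python
import numpy as np

def encode(pairs):
    """Hebbian encoding: W = sum over k of the outer products Xk^T Yk,
    so each entry is (wij)k = (xi)k (yj)k, as in the formula above."""
    m, n = len(pairs[0][0]), len(pairs[0][1])
    W = np.zeros((m, n), dtype=int)
    for X, Y in pairs:
        W += np.outer(X, Y)        # correlation matrix Wk of pair k
    return W

# Two bipolar associated pattern pairs (illustrative values).
pairs = [([1, -1, 1], [1, -1]),
         ([-1, -1, 1], [-1, 1])]
print(encode(pairs))
```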


Retrieval or recollection :
After memorization, the process of retrieving a stored pattern, given an input pattern X , is accomplished as follows :

first, compute the net input to each output node j using

input j = Σ i=1..m  xi wij

where input j stands for the weighted sum of the input or activation value of node j , for j = 1, 2, . . . , n ;

then, determine the units' output using a bipolar output function :

Yj = +1 if input j ≥ θj , -1 otherwise

where θj is the threshold value of output neuron j.
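These two retrieval steps translate directly into code (a hedged sketch; the threshold θj is taken as 0, a choice the slides leave implicit):

```python
import numpy as np

def decode(W, x, theta=0.0):
    """Retrieve: net input_j = sum_i x_i w_ij, then the bipolar output
    y_j = +1 if input_j >= theta, else -1."""
    net = np.array(x) @ W              # weighted sum for every output node j
    return np.where(net >= theta, 1, -1)

# Memory storing the single pair X = (1, -1, 1) -> Y = (1, -1).
W = np.outer([1, -1, 1], [1, -1])
print(decode(W, [1, -1, 1]))           # recovers the associated pattern
```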


Errors and noise :
The input pattern may contain errors and noise, or may be an incomplete version of some previously encoded pattern. When a corrupted input pattern is presented, the network will retrieve the stored pattern that is closest to the actual input pattern. The presence of noise or errors results only in a mere decrease rather than total degradation in the performance of the network. Associative memories are robust and fault tolerant because of many processing elements performing highly parallel and distributed computations.


Performance Measures :
The memory capacity and content-addressability are the measures of associative memory performance for correct retrieval. These two performance measures are related to each other.

Memory capacity refers to the maximum number of associated pattern pairs that can be stored and correctly retrieved.

Content-addressability is the ability of the network to retrieve the correct stored pattern.

If the stored input patterns are mutually orthogonal, perfect retrieval is possible. If the stored input patterns are not mutually orthogonal, non-perfect retrieval can happen due to crosstalk among the patterns.
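The orthogonality condition is easy to verify numerically; this small sketch (illustrative patterns) checks mutual orthogonality via dot products:

```python
import numpy as np

def mutually_orthogonal(patterns):
    """True if every distinct pair of stored patterns has a zero dot
    product - the condition for perfect retrieval without crosstalk."""
    for i in range(len(patterns)):
        for j in range(i + 1, len(patterns)):
            if int(np.dot(patterns[i], patterns[j])) != 0:
                return False
    return True

orthogonal_set = [[1, 1, -1, -1], [1, -1, 1, -1]]
crosstalk_set = [[1, 1, 1, -1], [1, 1, 1, 1]]
print(mutually_orthogonal(orthogonal_set))   # no crosstalk
print(mutually_orthogonal(crosstalk_set))    # crosstalk between the two patterns
```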

SC - AM models

2. Associative Memory Models

An associative memory is a system which stores mappings of specific input representations to specific output representations. Most associative memory implementations are realized as connectionist networks.

The simplest associative memory model is the linear associator, which is a feed-forward type of network. It has very low memory capacity and is therefore not much used.

The popular models are the Hopfield model and the bi-directional associative memory (BAM) model.


The simplest and among the first studied associative memory models is the linear associator. It is a feed-forward type of network in which the output can be produced in a single feed-forward computation, but it possesses a very low memory capacity and is therefore not much used.

The popular models are the Hopfield model and the bi-directional associative memory (BAM) model.

The Hopfield model is an auto-associative memory with recurrent connections : computation proceeds iteratively, the output being fed back as input, time after time, until the system becomes stable. Hopfield networks are single-layer networks that have fairly complex collective computational abilities and behavior.

The bi-directional associative memory (BAM) model is similar to the linear associator, but the connections are bidirectional, and therefore the network iterates back and forth between its two layers. The BAM model can perform both auto-associative and hetero-associative recall of stored information.

The network architectures of these three models are described in the next few slides.


The neural associative memory models follow different neural network architectures to memorize information.

The linear associator model is a feed-forward type network; it consists of two layers of processing units, one serving as the input layer while the other as the output layer.

The Hopfield model is a single-layer network; each unit is connected to every other unit in the network other than itself.

The bi-directional associative memory (BAM) model is similar to the linear associator, but the connections are bidirectional.

In this section, the neural network architectures of these models and the construction of the corresponding connection weight matrix W of the associative memory are illustrated.


Linear associator model :
It is a feed-forward type network where the output is produced in a single feed-forward computation. It consists of two layers of processing units, one serving as the input layer while the other as the output layer. The inputs are directly connected to the outputs, every input to every output, via a series of weights. The network architecture of the linear associator is as shown below.

Fig. Linear associator model : input neurons x1, x2, . . , xn ; output neurons y1, y2, . . , ym ; connection weights wij (w11, w12, w21, w22, . . , wnm) linking every input to every output.

All the n input units are connected to all the m output units via the connection weight matrix

W = [wij] n x m

where wij denotes the strength of the unidirectional connection from the i-th input unit to the j-th output unit.

The connection weight matrix stores the p different associated pattern pairs { (Xk , Yk) | k = 1, 2, . . . , p }. Building an associative memory is constructing the connection weight matrix W such that when an input pattern is presented, then the stored pattern associated with the input pattern is retrieved.

[Continued in next slide]


Encoding : The process of constructing the connection weight matrix is called encoding. During encoding, the weight values of the correlation matrix Wk for an associated pattern pair (Xk , Yk) are computed as

(wij)k = (xi)k (yj)k

where (xi)k is the i-th component of pattern Xk , and (yj)k is the j-th component of pattern Yk , for i = 1, 2, . . . , m and j = 1, 2, . . . , n.

Weight matrix : Construction of the connection weight matrix W , to store p different associated pattern pairs simultaneously, is accomplished by

W = α Σ k=1..p  Wk

where α is a proportionality or normalizing constant.

Decoding : After memorization, the network can be used for retrieval; the process of retrieving a stored pattern is called decoding. Given an input pattern X , decoding is accomplished by first computing the net input

input j = Σ i=1..m  xi wij

where input j stands for the weighted sum of the input or activation value of node j , for j = 1, 2, . . . , n , and then determining the units' output using a bipolar output function :

Yj = +1 if input j ≥ θj , -1 otherwise

where θj is the threshold value of output neuron j.

Note : The output units behave like linear threshold units, which compute a weighted sum of the input and produce a -1 or +1 depending on whether the weighted sum is below or above a certain threshold value.

Performance : When a corrupt input pattern is presented, the network will retrieve the stored pattern that is closest to the actual input pattern. Therefore, the linear associator is robust and fault tolerant. The presence of noise or error results in a mere decrease rather than total degradation in the performance of the network.


Hopfield model :
The Hopfield model is a single-layer recurrent network. It is an auto-associative memory : single patterns, rather than associated pattern pairs, are stored in memory.

Fig. Hopfield network : neurons 1 . . 4 with external inputs I1 . . I4 , outputs V1 . . V4 , and connection weights wij (w12, w13, . . , w43); an alternate view shows the same network as one layer of neurons with inputs I and outputs V.

The model consists of a single layer of processing elements where each unit is connected to every other unit in the network but not to itself. The connections are described by the weight matrix W , which is square and symmetric, i.e., wij = wji for i, j = 1, 2, . . . , m.

Each unit has an external input Ij , so that the net input to unit j is

input j = Σ i=1..m  xi wij + Ij ,  for j = 1, 2, . . . , m.

Each unit acts as both an input and an output unit. Like the linear associator, a single pattern is stored by computing the correlation matrix Wk = Xk^T Yk , where Xk = Yk (the pattern is associated with itself).


[Continued from previous slide]

Weight Matrix : Construction of the connection weight matrix W , to store p different patterns simultaneously in the Hopfield model, is accomplished by

W = Σ k=1..p  Wk

where Wk is the correlation matrix of the k-th pattern.

Decoding : After memorization, the network can be used for retrieval. Given an input pattern X , the decoding or retrieving is accomplished by computing, first, the net input

input j = Σ i=1..m  xi wij

where input j stands for the weighted sum of the input or activation value of node j , for j = 1, 2, . . . , n , and xi is the i-th component of pattern X , and then determining the units' output using a bipolar output function :

Yj = +1 if input j ≥ θj , -1 otherwise

where θj is the threshold value of output neuron j. The output units are linear threshold units that compute a weighted sum of the input and produce -1 or +1 depending on whether the weighted sum is below or above a certain threshold value.

Decoding in the Hopfield model is an iterative relaxation search for a stored pattern given an initial stimulus pattern. Given an input pattern X , decoding is accomplished by computing the net input to the units and determining the output of those units using the output function to produce the pattern X'. The pattern X' is then fed back to the units as an input pattern to produce the pattern X''. The pattern X'' is again fed back to the units to produce the pattern X'''. The process is repeated until the network stabilizes on a stored pattern where further computations do not change the output of the units.

In the next section, the working of an auto-correlator is explained : how to store patterns, how to recall a pattern from the stored patterns, and how to recognize a noisy pattern.
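The relaxation search just described can be sketched as a feedback loop (an illustrative implementation; thresholds and external inputs Ij are taken as zero, and the whole layer is updated synchronously):

```python
import numpy as np

def hopfield_recall(W, x, max_iters=100):
    """Iterate X -> X' -> X'' -> ... until further computations
    no longer change the output (a stable stored pattern)."""
    x = np.array(x)
    for _ in range(max_iters):
        x_next = np.where(x @ W >= 0, 1, -1)   # bipolar output of every unit
        if np.array_equal(x_next, x):          # stable state reached
            break
        x = x_next
    return x

# Store the single bipolar pattern A, with no self-connections (w_ii = 0).
A = np.array([-1, 1, -1, 1])
W = np.outer(A, A) - np.eye(4, dtype=int)
print(hopfield_recall(W, [1, 1, -1, 1]))       # noisy probe settles onto A
```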


Bidirectional associative memory (BAM) model :
Kosko (1988) extended the Hopfield model by incorporating an additional layer to perform recurrent auto-associations as well as hetero-associations on the stored memories. The network structure of the bidirectional associative memory model is similar to that of the linear associator, but the connections are bidirectional; i.e.,

wij = wji ,  for i = 1, 2, . . . , n and j = 1, 2, . . . , m.

Fig. Bidirectional associative memory model : input neurons x1, x2, . . , xn ; output neurons y1, y2, . . , ym ; bidirectional connection weights wij (w11, w12, . . , wnm) linking every input to every output.

In the bidirectional associative memory, a single associated pattern pair is stored by computing the correlation matrix Wk = Xk^T Yk. Construction of the connection weight matrix W , to store p different associated pattern pairs simultaneously, is accomplished by

W = Σ k=1..p  Wk .

SC - AM auto correlator

3. Auto-correlators

In an auto-correlator (auto-associative memory), single patterns, rather than associated pattern pairs, are stored in memory. In this section the working of an auto-correlator is illustrated using some examples :
- how to store the patterns,
- how to retrieve / recall a pattern from the stored patterns, and
- how to recognize a noisy pattern.


How to store patterns : Example
Consider the three bipolar patterns A1 , A2 , A3 to be stored as an auto-correlator :

A1 = ( -1,  1, -1,  1 )
A2 = (  1,  1,  1, -1 )
A3 = ( -1, -1, -1,  1 )

Note that the outer product of two vectors U = (U1, U2, U3, U4) and V = (V1, V2, V3) is

U^T V =
 U1V1  U1V2  U1V3
 U2V1  U2V2  U2V3
 U3V1  U3V2  U3V3
 U4V1  U4V2  U4V3

The outer products of the three bipolar patterns A1 , A2 , A3 with themselves are :

A1^T A1 =
  1 -1  1 -1
 -1  1 -1  1
  1 -1  1 -1
 -1  1 -1  1

A2^T A2 =
  1  1  1 -1
  1  1  1 -1
  1  1  1 -1
 -1 -1 -1  1

A3^T A3 =
  1  1  1 -1
  1  1  1 -1
  1  1  1 -1
 -1 -1 -1  1

The connection matrix T that stores the three patterns is the sum of these outer products :

T = [t ij] = Σ i=1..3  Ai^T Ai =
  3  1  3 -3
  1  3  1 -1
  3  1  3 -3
 -3 -1 -3  3
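The connection matrix T can be checked directly in a few lines of numpy:

```python
import numpy as np

A1 = np.array([-1,  1, -1,  1])
A2 = np.array([ 1,  1,  1, -1])
A3 = np.array([-1, -1, -1,  1])

# T = sum of the outer products Ai^T Ai of the stored bipolar patterns.
T = sum(np.outer(A, A) for A in (A1, A2, A3))
print(T)
```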


How to retrieve a pattern : Example
The previous slide shows the connection matrix T of the three bipolar patterns A1 , A2 , A3 :

T = [t ij] = Σ i=1..3  Ai^T Ai =
  3  1  3 -3
  1  3  1 -1
  3  1  3 -3
 -3 -1 -3  3

Consider the recall of the pattern A2 = ( 1, 1, 1, -1 ) , with components ai , from the three stored patterns. The recall equation is

aj new = f ( αj , aj old )  for j = 1, 2, . . . , p ,  where  αj = Σ i  ai t ij .

Computing αj = Σ ai t ij for each j with A2 as input :

α1 = 1x3 + 1x1 + 1x3 + (-1)x(-3) = 10
α2 = 1x1 + 1x3 + 1x1 + (-1)x(-1) =  6
α3 = 1x3 + 1x1 + 1x3 + (-1)x(-3) = 10
α4 = 1x(-3) + 1x(-1) + 1x(-3) + (-1)x3 = -10

Therefore aj new = f ( αj , aj old ) for j = 1, 2, . . . , p is computed from the pairs ( αj , aj old ) = (10 , 1) , (6 , 1) , (10 , 1) , (-10 , -1) , giving

( a1 new , a2 new , a3 new , a4 new ) = ( 1, 1, 1, -1 ) = A2 .

This is how to retrieve or recall a pattern from the stored patterns. Similarly, retrieval of vector pattern A3 gives

( a1 new , a2 new , a3 new , a4 new ) = ( -1, -1, -1, 1 ) = A3 .
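The recall equation can be sketched as follows; the two-argument function f keeps the old component when the net sum αj is exactly zero (an assumption consistent with the threshold function used later for the BAM):

```python
import numpy as np

T = np.array([[ 3,  1,  3, -3],
              [ 1,  3,  1, -1],
              [ 3,  1,  3, -3],
              [-3, -1, -3,  3]])

def recall(a_old):
    """a_j_new = f(alpha_j, a_j_old) with alpha_j = sum_i a_i t_ij:
    the sign of alpha_j, or the old value when alpha_j is zero."""
    a_old = np.array(a_old)
    alpha = a_old @ T
    return np.where(alpha > 0, 1, np.where(alpha < 0, -1, a_old))

print(recall([1, 1, 1, -1]))     # A2 is returned unchanged
print(recall([-1, -1, -1, 1]))   # A3 is returned unchanged
```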


Recognition of a noisy pattern : Example
Consider a vector A' = ( 1, 1, 1, 1 ) , which is a noisy, distorted version of one of the stored patterns. The task is to find the proximity of the noisy vector to the stored patterns, and then to recall the nearest pattern.

Note that the Hamming distance (HD) of a vector X from a vector Y , where X = (x1 , x2 , . . . , xn) and Y = (y1 , y2 , . . . , yn) , is

HD (X , Y) = Σ i=1..n  | xi - yi |

The HDs of A' from each of the stored patterns A1 , A2 , A3 are :

HD (A' , A1) = |1-(-1)| + |1-1| + |1-(-1)| + |1-1| = 4
HD (A' , A2) = 2
HD (A' , A3) = 6

In other words, the vector A' is closest to A2 : A' is a noisy version of vector A2.

Computation of the recall equation using vector A' as input yields :

α1 = 1x3 + 1x1 + 1x3 + 1x(-3) =  4
α2 = 1x1 + 1x3 + 1x1 + 1x(-1) =  4
α3 = 1x3 + 1x1 + 1x3 + 1x(-3) =  4
α4 = 1x(-3) + 1x(-1) + 1x(-3) + 1x3 = -4

Therefore aj new = f ( αj , aj old ) for j = 1, 2, . . . , p gives

( a1 new , a2 new , a3 new , a4 new ) = ( 1, 1, 1, -1 )

which is A2. This shows the accuracy of recall of an autocorrelator in the presence of noise.
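The Hamming-distance screening above, as a small helper (for bipolar components each mismatch contributes |xi - yi| = 2):

```python
def hamming(x, y):
    """HD(X, Y) = sum of |xi - yi| over the vector components."""
    return sum(abs(xi - yi) for xi, yi in zip(x, y))

A_noisy = (1, 1, 1, 1)
A1, A2, A3 = (-1, 1, -1, 1), (1, 1, 1, -1), (-1, -1, -1, 1)

# Distances 4, 2, 6: the noisy vector is nearest to A2.
print([hamming(A_noisy, A) for A in (A1, A2, A3)])
```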

SC - Bidirectional hetero AM

4. Bidirectional Hetero-associative Memory

The Hopfield one-layer unidirectional auto-associators have been discussed in the previous section. Kosko (1987) extended this network to a two-layer bidirectional structure called the bidirectional associative memory (BAM) , which can achieve hetero-association. An important performance attribute of the BAM is its ability to recall stored pairs, particularly in the presence of noise.

Definition : If the associated pattern pairs (X , Y) are different, and if the model recalls the pattern Y given a pattern X , or vice-versa, then it is termed a hetero-associative memory.

This section illustrates the bidirectional associative memory :
- BAM operations (retrieve the nearest pair, addition and deletion of pattern pairs),
- energy function (Kosko's correlation matrix, incorrect recall of pattern),
- multiple training encoding strategy (Wang's generalized correlation matrix).


BAM operations :
BAM is a two-layer nonlinear neural network, with one layer holding elements Ai and the other layer holding elements Bi. The basic coding procedure of the discrete BAM is as follows.

Consider N training pairs { (A1 , B1) , (A2 , B2) , . . , (Ai , Bi) , . . , (AN , BN) } , where Ai = (ai1 , ai2 , . . . , ain) and Bi = (bi1 , bi2 , . . . , bip) , and aij , bij are either in the ON or the OFF state :

in binary mode , ON = 1 and OFF = 0 ;  in bipolar mode , ON = 1 and OFF = -1.

The correlation matrix is constructed as

M0 = Σ i=1..N  [ Xi ]^T [ Yi ]

where Xi and Yi are the bipolar forms of Ai and Bi ; that is, xij (yij) is the bipolar form of aij (bij).

The energy function E for the pair ( α , β ) and correlation matrix M is

E = - α M β^T .

The BAM operations - how to retrieve the nearest pair, and the addition and deletion of pattern pairs - are explained in the following slides.

Retrieve the nearest (Ai , Bi) pattern pair, given any pair ( α , β ) :
(ref : previous slide)

Example : start with an initial condition, which is any given pattern pair ( α , β ) , and determine a finite sequence of pattern pairs ( α' , β' ) , ( α'' , β'' ) , . . . until an equilibrium point ( αF , βF ) is reached, where

β'  = Φ ( α M )    and  α'  = Φ ( β' M^T )
β'' = Φ ( α' M )   and  α'' = Φ ( β'' M^T )

Here Φ (F) = G = ( g1 , g2 , . . . . , gr ) , where F = ( f1 , f2 , . . . . , fr ) , M is the correlation matrix, and

gi =  1                          if  fi > 0
      0 (binary) / -1 (bipolar)  if  fi < 0
      previous g i               if  fi = 0

Kosko has proved that this process will converge for any correlation matrix M.
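This bidirectional relaxation can be sketched in code (an illustrative implementation of the Φ function and the back-and-forth iteration; the stored pair values are made up for the demonstration):

```python
import numpy as np

def phi(f, g_prev):
    """Phi(F): gi = 1 if fi > 0, -1 (bipolar) if fi < 0, previous gi if fi = 0."""
    return np.where(f > 0, 1, np.where(f < 0, -1, g_prev))

def bam_recall(M, alpha, beta, max_iters=100):
    """Iterate beta' = Phi(alpha M), alpha' = Phi(beta' M^T), ...
    until the pair (alpha, beta) reaches equilibrium."""
    alpha, beta = np.array(alpha), np.array(beta)
    for _ in range(max_iters):
        beta_next = phi(alpha @ M, beta)
        alpha_next = phi(beta_next @ M.T, alpha)
        if (np.array_equal(alpha_next, alpha)
                and np.array_equal(beta_next, beta)):
            break
        alpha, beta = alpha_next, beta_next
    return alpha, beta

# Store one pair X -> Y, then recall it from the key X alone.
X = np.array([1, -1, 1, -1])
Y = np.array([1, 1, -1])
M = np.outer(X, Y)
a, b = bam_recall(M, X, np.ones(3, dtype=int))
print(a, b)
```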


Addition and deletion of pattern pairs :
Given a set of pattern pairs (Xi , Yi) , for i = 1 , 2 , . . . , n , and their correlation matrix M , a new pair (X' , Y') can be added, or an existing pair (Xj , Yj) can be deleted from the memory model.

Addition : add a new pair (X' , Y') to the existing correlation matrix M ; the new correlation matrix Mnew is given by

Mnew = X1^T Y1 + X2^T Y2 + . . . . + Xn^T Yn + X'^T Y'

Deletion : subtract the matrix corresponding to an existing pair (Xj , Yj) from the correlation matrix M ; the new correlation matrix Mnew is given by

Mnew = M - ( Xj^T Yj )

Note : The addition and deletion of information is similar to the functioning of the system as a human memory exhibiting learning and forgetfulness.


Energy function for BAM :
Note : A system that changes with time is a dynamic system. There are two types of dynamics in a neural network : during the training phase, the iterative update of the weight values, and during the production phase, the computation of the states of the units from the data flows. The energy function (or Lyapunov function) is a bounded function of the state of the system which decreases with every state change. For an associative memory to store a pattern, the value of the energy function for that pattern must be a local minimum, and adding new patterns must not destroy the previously stored patterns.

The stability of a BAM can be proved by identifying the energy function E with each state (A , B).

For auto-associative memory : the energy function is E(A) = - A M A^T.
For bidirectional hetero-associative memory : the energy function is E(A , B) = - A M B^T ; for the particular case A = B , it corresponds to the Hopfield auto-associative energy function.

We wish to retrieve the nearest (Ai , Bi) pair, when any ( α , β ) pair is presented as an initial condition to the BAM. The neurons change their states until a bidirectional stable state (Af , Bf) is reached. Kosko has shown that such a stable state is reached for any matrix M and that it corresponds to a local minimum of the energy function. Each cycle of decoding lowers the energy E , where the energy for any point ( α , β ) is given by E = - α M β^T.

However, if the energy E = - Ai M Bi^T , evaluated using the coordinates of the pair (Ai , Bi) , does not constitute a local minimum, then the pair cannot be recalled, even though one starts with α = Ai. Thus Kosko's encoding method does not ensure that the stored pairs are at a local minimum.


Example 1 : The working of Kosko's BAM for the retrieval of an associated pair.

Start with three associated pattern pairs (X1 , Y1) , (X2 , Y2) , (X3 , Y3) in bipolar form, where X3 = ( -1 -1 1 -1 1 1 ) and Y3 = ( -1 1 1 1 -1 ) , and construct the correlation matrix

M = X1^T Y1 + X2^T Y2 + X3^T Y3 .

Suppose we start with α = X3 , and we hope to retrieve the associated pair Y3. With Φ the threshold function of the previous slide, the calculations proceed as :

α M              = ( -1 -1  1 -1  1  1 ) M = ( -6  6  6  6 -6 )
Φ ( α M ) = β'   = ( -1  1  1  1 -1 )
β' M^T           = ( -5 -5  5 -3  7  5 )
Φ ( β' M^T ) = α' = ( -1 -1  1 -1  1  1 ) = α
α' M             = ( -1 -1  1 -1  1  1 ) M = ( -6  6  6  6 -6 )
Φ ( α' M ) = β'' = ( -1  1  1  1 -1 ) = β'

The process has converged. Hence ( αF , βF ) = ( X3 , Y3 ) is correctly recalled, a desired result.


Example 2 : The working of incorrect recall by Kosko's BAM.

Start with the following three associated pattern pairs :

A1 = ( 1 0 0 1 1 1 0 0 0 ) ,  B1 = ( 1 1 1 0 0 0 0 1 0 )
A2 = ( 0 1 1 1 0 0 1 1 1 ) ,  B2 = ( 1 0 0 0 0 0 0 0 1 )
A3 = ( 1 0 1 0 1 1 0 1 1 ) ,  B3 = ( 0 1 0 1 0 0 1 0 1 )

Converted to bipolar form :

X1 = (  1 -1 -1  1  1  1 -1 -1 -1 ) ,  Y1 = (  1  1  1 -1 -1 -1 -1  1 -1 )
X2 = ( -1  1  1  1 -1 -1  1  1  1 ) ,  Y2 = (  1 -1 -1 -1 -1 -1 -1 -1  1 )
X3 = (  1 -1  1 -1  1  1 -1  1  1 ) ,  Y3 = ( -1  1 -1  1 -1 -1  1 -1  1 )

The correlation matrix M = X1^T Y1 + X2^T Y2 + X3^T Y3 is

 -1   3   1   1  -1  -1   1   1  -1
  1  -3  -1  -1   1   1  -1  -1   1
 -1  -1  -3   1  -1  -1   1  -3   3
  3  -1   1  -3  -1  -1  -3   1  -1
 -1   3   1   1  -1  -1   1   1  -1
 -1   3   1   1  -1  -1   1   1  -1
  1  -3  -1  -1   1   1  -1  -1   1
 -1  -1  -3   1  -1  -1   1  -3   3
 -1  -1  -3   1  -1  -1   1  -3   3

[Continued in next slide]

[Continued from previous slide]

The calculations for the retrieval of Y2 , starting with α = X2 = ( -1 1 1 1 -1 -1 1 1 1 ) (the stored associated pattern is Y2 = ( 1 -1 -1 -1 -1 -1 -1 -1 1 )) , yield :

α M               = (   5 -19 -13  -5   1   1  -5 -13  13 )
Φ ( α M ) = β'    = (   1  -1  -1  -1   1   1  -1  -1   1 )
β' M^T            = ( -11  11   5   5 -11 -11  11   5   5 )
Φ ( β' M^T ) = α' = (  -1   1   1   1  -1  -1   1   1   1 ) = X2
α' M              = (   5 -19 -13  -5   1   1  -5 -13  13 )
Φ ( α' M ) = β''  = (   1  -1  -1  -1   1   1  -1  -1   1 ) = β'

The process has converged to F α = α' = X2 and F β = β' = ( 1 -1 -1 -1 1 1 -1 -1 1 ). But β' is not Y2. Thus the vector pair (X2 , Y2) is not recalled correctly by Kosko's decoding process.

[Continued in next slide]


[Continued from previous slide]

However, the coordinates of the pair (X2 , Y2) are "one Hamming distance" away from those of (X2 , Y2') , where Y2' = ( 1 -1 -1 -1 1 -1 -1 -1 1 ) differs from Y2 only in the fifth component.

The energy at (X2 , Y2) is E2 = - X2 M Y2^T = -71 , while the energy at the point (X2 , Y2') is

E = - X2 M Y2'^T = -73

which is lower than E2 , confirming the hypothesis that (X2 , Y2) is not at its local minimum of E.

Note : The correlation matrix M used by Kosko does not guarantee that the energy of a training pair is at its local minimum; a pattern pair can be recalled if and only if this pair is at a local minimum of the energy surface.


Multiple training encoding strategy :
Note : (Ref. examples in the previous section.) Kosko extended the unidirectional auto-associative memory to the bidirectional associative processes, using the correlation matrix M = Σ Xi^T Yi computed from the pattern pairs. With this matrix, the system proceeds to retrieve the nearest pair given any pair ( α , β ) , with the help of the recall equations. However, Kosko's encoding method does not ensure that the stored pairs are at a local minimum and hence may result in incorrect recall.

The multiple training encoding strategy (Wang's method) is an enhancement of Kosko's encoding strategy which ensures the correct recall of pattern pairs. The generalized correlation matrix is

M = Σ  qi Xi^T Yi

where qi is viewed as the pair weight for Xi^T Yi. It denotes the minimum number of times a pattern pair (Xi , Yi) must be used in training to guarantee recall of that pair.

To recover a pair (Ai , Bi) using multiple training of order q , the correlation matrix is augmented or supplemented with the matrix

P = ( q - 1 ) Xi^T Yi

added to the existing correlation matrix. As a result, the energy E' can be reduced to an arbitrarily low value by a suitable choice of q. This also ensures that the energy at (Ai , Bi) does not exceed the energy at points which are one Hamming distance away from this location.

The new value of the energy function E evaluated at (Ai , Bi) then becomes

E' (Ai , Bi) = - Ai M Bi^T - ( q - 1 ) Ai Xi^T Yi Bi^T

The next few slides explain the multiple training encoding strategy for the recall of the three pattern pairs (X1 , Y1) , (X2 , Y2) , (X3 , Y3) using one and the same augmentation matrix M , followed by an algorithm to summarize the complete process of multiple training.


Example 3 : The working of the multiple training encoding strategy, which ensures the correct recall of a pattern pair.

Consider the same three associated pattern pairs as in Example 2, in bipolar form :

X1 = (  1 -1 -1  1  1  1 -1 -1 -1 ) ,  Y1 = (  1  1  1 -1 -1 -1 -1  1 -1 )
X2 = ( -1  1  1  1 -1 -1  1  1  1 ) ,  Y2 = (  1 -1 -1 -1 -1 -1 -1 -1  1 )
X3 = (  1 -1  1 -1  1  1 -1  1  1 ) ,  Y3 = ( -1  1 -1  1 -1 -1  1 -1  1 )

To recall the pair (X2 , Y2) correctly, that pair is trained one extra time; the augmented correlation matrix then becomes

M = X1^T Y1 + 2 X2^T Y2 + X3^T Y3 =

 -2   4   2   2   0   0   2   2  -2
  2  -4  -2  -2   0   0  -2  -2   2
  0  -2  -4   0  -2  -2   0  -4   4
  4  -2   0  -4  -2  -2  -4   0   0
 -2   4   2   2   0   0   2   2  -2
 -2   4   2   2   0   0   2   2  -2
  2  -4  -2  -2   0   0  -2  -2   2
  0  -2  -4   0  -2  -2   0  -4   4
  0  -2  -4   0  -2  -2   0  -4   4


The retrieval of the pattern pair (X2 , Y2) , starting with α = X2 and using this augmented correlation matrix M , proceeds as (Φ is the threshold function defined earlier) :

α M               = (  14 -28 -22 -14  -8  -8 -14 -22  22 )
Φ ( α M ) = β'    = (   1  -1  -1  -1  -1  -1  -1  -1   1 )
β' M^T            = ( -16  16  18  18 -16 -16  16  18  18 )
Φ ( β' M^T ) = α' = (  -1   1   1   1  -1  -1   1   1   1 ) = X2
α' M              = (  14 -28 -22 -14  -8  -8 -14 -22  22 )
Φ ( α' M ) = β''  = (   1  -1  -1  -1  -1  -1  -1  -1   1 ) = β'

Here F β = β' = ( 1 -1 -1 -1 -1 -1 -1 -1 1 ) = Y2 and F α = α' = X2. Thus, (X2 , Y2) is correctly recalled, using the augmented correlation matrix M.

[Continued in next slide]


Note : The previous slide showed that the pattern pair (X2 , Y2) is correctly recalled, using the augmented correlation matrix

M = X1^T Y1 + 2 X2^T Y2 + X3^T Y3

but the same matrix M cannot correctly recall the other pattern pair (X1 , Y1) , as shown below.

X1 = ( 1 -1 -1 1 1 1 -1 -1 -1 ) ,  Y1 = ( 1 1 1 -1 -1 -1 -1 1 -1 )

The calculations for the retrieval of Y1 , starting with α = X1 , yield :

α M               = (  -6  24  22   6   4   4   6  22 -22 )
Φ ( α M ) = β'    = (  -1   1   1   1   1   1   1   1  -1 )
β' M^T            = (  16 -16 -18 -18  16  16 -16 -18 -18 )
Φ ( β' M^T ) = α' = (   1  -1  -1  -1   1   1  -1  -1  -1 )
α' M              = ( -14  28  22  14   8   8  14  22 -22 )
Φ ( α' M ) = β''  = (  -1   1   1   1   1   1   1   1  -1 ) = β'

Here F β = β' is not Y1 , and F α = α' is not X1. Thus, the pattern pair (X1 , Y1) is not correctly recalled, using the augmented correlation matrix M. To tackle this problem, the augmented correlation matrix needs to be further augmented by multiple training of (X1 , Y1).

[Continued from previous slide]

The previous slide showed that (X1 , Y1) cannot be recalled under the same augmentation matrix M that is able to recall (X2 , Y2). However, this problem can be solved by multiple training of (X1 , Y1) , which gives the further augmented correlation matrix

M = 2 X1^T Y1 + 2 X2^T Y2 + X3^T Y3 =

 -1   5   3   1  -1  -1   1   3  -3
  1  -5  -3  -1   1   1  -1  -3   3
 -1  -3  -5   1  -1  -1   1  -5   5
  5  -1   1  -5  -3  -3  -5   1  -1
 -1   5   3   1  -1  -1   1   3  -3
 -1   5   3   1  -1  -1   1   3  -3
  1  -5  -3  -1   1   1  -1  -3   3
 -1  -3  -5   1  -1  -1   1  -5   5
 -1  -3  -5   1  -1  -1   1  -5   5

Now observe in the next slide that all three pairs can be correctly recalled.

[Continued in next slide]


Recall of all three pattern pairs, using the augmented correlation matrix M = 2 X1^T Y1 + 2 X2^T Y2 + X3^T Y3 of the previous slide (Φ is the threshold function defined earlier).

Recall of pattern pair (X1 , Y1) , where X1 = ( 1 -1 -1 1 1 1 -1 -1 -1 ) and Y1 = ( 1 1 1 -1 -1 -1 -1 1 -1 ) :

α M               = (   3  33  31  -3  -5  -5  -3  31 -31 )
Φ ( α M ) = β'    = (   1   1   1  -1  -1  -1  -1   1  -1 )
β' M^T            = (  13 -13 -19  23  13  13 -13 -19 -19 )
Φ ( β' M^T ) = α' = (   1  -1  -1   1   1   1  -1  -1  -1 )
α' M              = (   3  33  31  -3  -5  -5  -3  31 -31 )
Φ ( α' M ) = β''  = (   1   1   1  -1  -1  -1  -1   1  -1 ) = β'

F α = α' = X1 and F β = β' = Y1 : the pair (X1 , Y1) is correctly recalled.

Recall of pattern pair (X2 , Y2) , where X2 = ( -1 1 1 1 -1 -1 1 1 1 ) and Y2 = ( 1 -1 -1 -1 -1 -1 -1 -1 1 ) :

α M               = (   7 -35 -29  -7  -1  -1  -7 -29  29 )
Φ ( α M ) = β'    = (   1  -1  -1  -1  -1  -1  -1  -1   1 )
β' M^T            = ( -15  15  17  19 -15 -15  15  17  17 )
Φ ( β' M^T ) = α' = (  -1   1   1   1  -1  -1   1   1   1 )
α' M              = (   7 -35 -29  -7  -1  -1  -7 -29  29 )
Φ ( α' M ) = β''  = (   1  -1  -1  -1  -1  -1  -1  -1   1 ) = β'

F α = α' = X2 and F β = β' = Y2 : the pair (X2 , Y2) is correctly recalled.

Recall of pattern pair (X3 , Y3) , where X3 = ( 1 -1 1 -1 1 1 -1 1 1 ) and Y3 = ( -1 1 -1 1 -1 -1 1 -1 1 ) :

α M               = ( -13  17  -1  13  -5  -5  13  -1   1 )
Φ ( α M ) = β'    = (  -1   1  -1   1  -1  -1   1  -1   1 )
β' M^T            = (   1  -1  17 -13   1   1  -1  17  17 )
Φ ( β' M^T ) = α' = (   1  -1   1  -1   1   1  -1   1   1 )
α' M              = ( -13  17  -1  13  -5  -5  13  -1   1 )
Φ ( α' M ) = β''  = (  -1   1  -1   1  -1  -1   1  -1   1 ) = β'

F α = α' = X3 and F β = β' = Y3 : the pair (X3 , Y3) is correctly recalled.

[Continued in next slide]


Thus, the multiple training encoding strategy ensures the correct recall of a pair for a suitable augmentation of the correlation matrix M. The generalization of the correlation matrix, for the correct recall of all training pairs, is written as

M = Σ i=1..N  qi Xi^T Yi

where the qi are positive real numbers. This modified correlation matrix is called the generalized correlation matrix; using the multiple training method, it guarantees the correct recall of all the training pattern pairs.

Algorithm :
To summarize, the complete process of multiple training encoding is given as an algorithm.

Algorithm Mul_Tr_Encode ( N , Xi , Yi , qi )  where
    N  : number of stored pattern pairs
    Xi = ( x i1 , x i2 , . . , x in ) , the bipolar input patterns, X = ( X1 , X2 , . . , XN )
    Yi = ( y i1 , y i2 , . . , y ip ) , the bipolar output patterns, Y = ( Y1 , Y2 , . . , YN )
    qi : the weight (multiple training factor) of pattern pair i

Step 1  Initialize the correlation matrix M to the null matrix : M <- [0]
Step 2  Compute the correlation matrix M as
            For i <- 1 to N
                M <- M (+) [ qi * Transpose ( Xi ) (.) ( Yi ) ]
            end
        (symbols : (+) matrix addition, (.) matrix multiplication, * scalar multiplication)
Step 3  Read the input bipolar pattern A'
Step 4  Compute A_M , where A_M <- A' (.) M
Step 5  Apply the threshold function Φ to A_M to get B' , i.e., B' <- Φ ( A_M )
Step 6  Output B' , the pattern associated with A'
end
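The algorithm above maps directly onto a few lines of numpy (an illustrative implementation; it is exercised here on the three bipolar pairs of the earlier example, with pair weights q = (2, 2, 1) as in the final augmented matrix):

```python
import numpy as np

def mul_tr_encode(X, Y, q):
    """Steps 1-2: M <- sum_i qi * Transpose(Xi)(Yi)."""
    M = np.zeros((len(X[0]), len(Y[0])), dtype=int)
    for Xi, Yi, qi in zip(X, Y, q):
        M += qi * np.outer(Xi, Yi)
    return M

def recall(M, A_prime):
    """Steps 3-6: A_M <- A' M, then B' <- Phi(A_M)."""
    A_M = np.array(A_prime) @ M
    return np.where(A_M >= 0, 1, -1)   # threshold Phi, ties mapped to +1

X = [[ 1, -1, -1,  1,  1,  1, -1, -1, -1],
     [-1,  1,  1,  1, -1, -1,  1,  1,  1],
     [ 1, -1,  1, -1,  1,  1, -1,  1,  1]]
Y = [[ 1,  1,  1, -1, -1, -1, -1,  1, -1],
     [ 1, -1, -1, -1, -1, -1, -1, -1,  1],
     [-1,  1, -1,  1, -1, -1,  1, -1,  1]]
M = mul_tr_encode(X, Y, q=[2, 2, 1])

# With the multiple-training weights, every stored key recalls its pattern.
for Xi, Yi in zip(X, Y):
    assert recall(M, Xi).tolist() == Yi
print("all three pairs correctly recalled")
```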

5. References : Textbooks

1. "Neural Networks, Fuzzy Logic, and Genetic Algorithms - Synthesis and Applications", by S. Rajasekaran and G.A. Vijayalaksmi Pai, (2005), Prentice Hall, Chapter 4, page 87-116.
2. "Elements of Artificial Neural Networks", by Kishan Mehrotra, Chilukuri K. Mohan and Sanjay Ranka, (1996), MIT Press, Chapter 6, page 217-263.
3. "Fundamentals of Neural Networks : Architectures, Algorithms and Applications", by Laurene V. Fausett, (1993), Prentice Hall, Chapter 3, page 101-152.
4. "Neural Network Design", by Martin T. Hagan, Howard B. Demuth and Mark Hudson Beale, (1996), PWS Publ. Company, Chapter 13, page 13-1 to 13-37.
5. Chapter 6-7, page 143-208.

Further references are being prepared for inclusion at a later date.
