Neural Networks
Lecture 5: Associative Memory
H.A. Talebi
Farzaneh Abdollahi
Winter 2011
Outline
Introduction
Static Memories
Introduction
The purpose of search is to output one or all stored items that match the search argument and to retrieve them entirely or partially.
$v$ is available at the output at a later time than when the input was applied.
The words are accessed based on the content of the key vector.
When the network is excited by a portion of the stored data, the efficient response of the autoassociative memory is the completed vector $x^{(i)}$.
In heteroassociative memory, the content of $x^{(i)}$ provides the stored vector $v^{(i)}$.
There is no storage for the prototype $x^{(i)}$ or $v^{(i)}$ at any location of the network.
The entire mapping is distributed in the network.
The mapping is implemented through dense connections, feedback, and/or a nonlinear thresholding operation.
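A rough numerical sketch of these properties (not from the slides): two assumed bipolar prototypes are superposed into a single weight matrix, and a corrupted cue is completed through a dense matrix-vector product followed by a sign threshold.

```python
import numpy as np

# Illustrative autoassociative memory: the prototypes below are assumptions
# chosen only for this demo; there is no per-pattern storage location.
x1 = np.array([1, -1, 1, -1, 1, -1])
x2 = np.array([1, 1, -1, -1, 1, 1])

# Distributed storage: superposition of outer products x^(i) x^(i)T.
W = np.outer(x1, x1) + np.outer(x2, x2)

cue = x1.copy()
cue[4] = -1                           # excite the network with a corrupted portion of x1
recalled = np.sign(W @ cue)           # dense connections + nonlinear thresholding
print(np.array_equal(recalled, x1))   # True: the completed x^(1) is recovered
```

The same outer-product construction reappears formally for the linear associator later in the lecture.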
Static memory
The response is an instantaneous function of the present input: $v = M_1[x]$; there is no feedback.
Dynamic memory
The response is produced by a recursion through feedback: $v^{k+1} = M_2[v^k]$.
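The distinction can be made concrete with a minimal sketch, assuming a thresholded linear map for both $M_1$ and $M_2$ (the operators and the toy pattern are assumptions, not the lecture's definitions):

```python
import numpy as np

# Toy weight matrix storing one bipolar pattern (an assumption for the demo).
x1 = np.array([1, -1, 1, -1])
W = np.outer(x1, x1)

def static_recall(W, x):
    """Static memory: v = M1[x], an instantaneous feedforward mapping."""
    return np.sign(W @ x)

def dynamic_recall(W, x, max_steps=10):
    """Dynamic memory: iterate v^{k+1} = M2[v^k] until the recursion settles."""
    v = np.sign(W @ x)
    for _ in range(max_steps):
        v_next = np.sign(W @ v)
        if np.array_equal(v_next, v):   # fixed point reached
            return v_next
        v = v_next
    return v

cue = np.array([1, -1, 1, 1])           # a noisy version of x1
print(static_recall(W, cue))            # one-shot response
print(dynamic_recall(W, cue))           # response after the feedback loop settles
```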
Linear Associator
The linear associator stores $p$ pattern pairs $(s^{(i)}, f^{(i)})$, $i = 1, \dots, p$, and produces the output $v = W s$ for an input key $s$.
Using the Hebbian learning rule, each weight is updated as $w'_{ij} = w_{ij} + f_i s_j$,
where $f_i$, $s_j$ are the $i$th and $j$th elements of vectors $f$ and $s$, respectively, and $w_{ij}$ is the weight between them before the update.
To generalize the formula: $W' = W + f s^T$, where $W$ is the weight matrix before the update.
Initializing the weights at their unbiased position, $W = 0$, gives $W' = f s^T$.
Since there are $p$ pairs of patterns, the superposition of weights is $W' = \sum_{i=1}^{p} f^{(i)} s^{(i)T}$.
The memory weight matrix $W'$ above has the form of a cross-correlation matrix.
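A short numerical sketch of this construction, using hypothetical pattern pairs $(f^{(i)}, s^{(i)})$ chosen only for illustration:

```python
import numpy as np

# Hypothetical stored pairs (f^(i), s^(i)); the dimensions are arbitrary.
f_pats = [np.array([1., 0., 1.]), np.array([0., 1., 1.])]
s_pats = [np.array([1., 0., 0., 1.]), np.array([0., 1., 1., 0.])]

# Start from the unbiased position W = 0 and superpose the outer products:
# W' = sum_i f^(i) s^(i)T  (the cross-correlation matrix of the pairs).
W = np.zeros((3, 4))
for f, s in zip(f_pats, s_pats):
    W += np.outer(f, s)     # elementwise: w_ij <- w_ij + f_i * s_j

print(W)
```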
Consider one of the stored vectors, $s^{(j)}$, as the key vector at the input:
$v = \left( \sum_{i=1}^{p} f^{(i)} s^{(i)T} \right) s^{(j)} = \sum_{i=1}^{p} f^{(i)} \left( s^{(i)T} s^{(j)} \right)$
If the stored key vectors are orthonormal, i.e., $s^{(i)T} s^{(j)} = \delta_{ij}$, the cross-talk terms vanish and $v = f^{(j)}$, so the retrieval is ideal.
This is a strict condition and may not always hold for all sets of vectors.
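A small check of this condition, with assumed vectors: orthonormal keys give exact retrieval of $f^{(j)}$, while a non-orthogonal key set introduces cross-talk.

```python
import numpy as np

# Orthonormal keys (assumed for illustration): s^(i)T s^(j) = delta_ij.
s1 = np.array([1., 0., 0., 0.])
s2 = np.array([0., 1., 0., 0.])
f1 = np.array([2., -1., 3.])
f2 = np.array([0.5, 4., -2.])

W = np.outer(f1, s1) + np.outer(f2, s2)

# v = (sum_i f^(i) s^(i)T) s^(j) = f^(j) when the keys are orthonormal.
print(np.allclose(W @ s1, f1), np.allclose(W @ s2, f2))   # True True

# A non-orthogonal key set breaks perfect retrieval: cross-talk appears.
s2_bad = np.array([0.6, 0.8, 0., 0.])                     # not orthogonal to s1
W_bad = np.outer(f1, s1) + np.outer(f2, s2_bad)
print(W_bad @ s1)                                         # f1 + 0.6*f2, not f1
```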