Associative Memories
HOPFIELD NETWORKS
• The Hopfield network implements a so-called content-addressable memory.
• A collection of patterns, called fundamental memories, is stored in the network by means of its weights.
• Each neuron represents one component of the input.
• The weight of the link between two neurons measures the correlation between the two corresponding components over the fundamental memories: if the weight is high, the two components are often equal in the fundamental memories.
ARCHITECTURE: recurrent
[Figure: recurrent architecture: the output of every neuron is fed back to all the other neurons through unit-delay (z^{-1}) operators.]
Hopfield discrete NN
• Input vector values are in {-1, 1} (or {0, 1}).
• The number of neurons is equal to the input dimension.
• Every neuron has a link from every other neuron (recurrent architecture) except itself (no self-feedback).
• The state of a neuron at time n is its output value.
• The state of the network at time n is the vector of the neuron states.
• The activation function used to update a neuron state is the sign function; however, if the input of the activation function is 0, the new output (state) of the neuron is equal to the old one.
• Weights are symmetric: w_{ij} = w_{ji}.
Notation
• N: input dimension.
• M: number of fundamental memories.
• f_{µ,i}: the i-th component of the µ-th fundamental memory.
• x_i(n): the state of neuron i at time n.
Weights computation
w_{ji} = \frac{1}{M} \sum_{\mu=1}^{M} f_{\mu,i} \, f_{\mu,j}    for j ≠ i,  j = 1, ..., N
w_{ji} = 0    for j = i
where w_{ji} is the weight from neuron i to neuron j. The elements of the vectors f_µ are in {-1, +1}. Once computed, the synaptic weights are fixed.
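As an illustration (not part of the slides), a minimal Python sketch of this weight computation; the array F holding the fundamental memories row by row is an assumed name:

import numpy as np

def hopfield_weights(F):
    """Hopfield weight matrix from fundamental memories.

    F: array of shape (M, N) with entries in {-1, +1},
       one fundamental memory per row.
    """
    M, N = F.shape
    W = (F.T @ F) / M          # w_ji = (1/M) * sum_mu f_mu,i * f_mu,j
    np.fill_diagonal(W, 0.0)   # no self-feedback: w_jj = 0
    return W                   # symmetric: W[i, j] == W[j, i]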
NN Execution
1. Storage. The weights w_{ji} are computed from the fundamental memories as on the previous slide and are then kept fixed.
2. Initialisation. Let x_probe denote an input vector (probe) presented to the network. The algorithm is initialised by setting
x_j(0) = x_{probe,j},   j = 1, ..., N
where x_j(0) is the state of neuron j at time n = 0, and x_{probe,j} is the j-th element of the probe vector x_probe.
3. Iteration. Update the state of the network according to
x_j(n+1) = sign( \sum_{i=1}^{N} w_{ji} x_i(n) ),   j = 1, 2, ..., N
Repeat the iteration until the state vector x remains unchanged.
4. Outputting. Let x_fixed denote the fixed point (or stable state, that is, such that x(n+1) = x(n)) computed at the end of step 3. The resulting output y of the network is
y = x_fixed
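A minimal Python sketch of this retrieval procedure (synchronous update of all neurons, mirroring the formula above; the function name is illustrative):

import numpy as np

def hopfield_retrieve(W, x_probe, max_iter=100):
    """Iterate the Hopfield update until the state stops changing."""
    x = np.asarray(x_probe, dtype=float)         # x_j(0) = x_probe,j
    for _ in range(max_iter):
        h = W @ x                                # local field sum_i w_ji x_i(n)
        x_new = np.where(h == 0, x, np.sign(h))  # keep old state when the field is 0
        if np.array_equal(x_new, x):             # fixed point: x(n+1) = x(n)
            break
        x = x_new
    return x                                     # y = x_fixed

In practice the neurons are often updated one at a time (asynchronously), which is guaranteed to converge for symmetric weights with no self-feedback; the synchronous form above simply follows the formula on the slide.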
Example 1
[Figure: a 3-neuron Hopfield network with its signed weights, and the state-transition diagram of the eight possible states, which fall into two attraction basins (attraction basin 1 and attraction basin 2), each converging to one stable state.]
Example 2
• Separation of patterns using the two fundamental memories (-1, -1, -1) and (1, 1, 1).
• Find the weights w_{ij} that produce the following behaviour:
[Figure: desired state transitions: (-1, -1, 1), (-1, 1, -1), (1, -1, -1) → (-1, -1, -1) and (-1, 1, 1), (1, -1, 1), (1, 1, -1) → (1, 1, 1); sketch of the 3-neuron network with the unknown weights w_{ij} = ?]
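A quick check of this example in Python, combining the weight rule and the retrieval sketch given earlier (illustrative, not from the slides):

import numpy as np

# Fundamental memories of Example 2
F = np.array([[-1, -1, -1],
              [ 1,  1,  1]], dtype=float)
M, N = F.shape
W = (F.T @ F) / M
np.fill_diagonal(W, 0.0)   # all off-diagonal weights are +1: the components
                           # are equal in both fundamental memories

x = np.array([1.0, 1.0, -1.0])   # probe with one flipped bit
for _ in range(10):
    h = W @ x
    x_new = np.where(h == 0, x, np.sign(h))
    if np.array_equal(x_new, x):
        break
    x = x_new
print(x)   # -> [1. 1. 1.], the nearest fundamental memory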
CONVERGENCE
• Every stable state x is at an "energy minimum". (A state x is stable if x(n+1) = x(n).)
What is "energy"?
Energy is a function (a Lyapunov function)
E: States → ℝ
such that every firing (change of its output value) of a neuron decreases the value of E. For the Hopfield network the energy of a state x is taken to be
E(x) = -\frac{1}{2} \sum_{i,j} w_{ij} x_i x_j
• The sum \sum_{i,j} w_{ij} x_i x_j will thus be large for a state x which is a stored fundamental memory, so the energy E(x) of a stored pattern is small (near a minimum of E).
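A one-line Python sketch of this energy (illustrative, using a weight matrix W as in the earlier sketches):

import numpy as np

def hopfield_energy(W, x):
    """E(x) = -1/2 * sum_{i,j} w_ij x_i x_j; low for stored patterns."""
    return -0.5 * x @ W @ x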
Energy Decreases
Claim:
Firing a neuron decreases the energy E.
Proof:
Let l be the neuron that fires (changes its output value). Either
\sum_{i=1}^{N} w_{li} x_i > 0    (above threshold, so x_l changes from -1 to +1)
or
\sum_{i=1}^{N} w_{li} x_i < 0    (below threshold, so x_l changes from +1 to -1).
Thus in both cases
(x_l' - x_l) \sum_{i=1}^{N} w_{li} x_i > 0
where x_l' denotes the value of neuron l after firing.
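The conclusion of the proof can be sketched as the following standard computation (a reconstruction, assuming the energy E(x) = -1/2 \sum_{i,j} w_{ij} x_i x_j defined above, symmetric weights and no self-feedback). When only neuron l changes its state,

\begin{align*}
E(x') - E(x)
  &= -\tfrac{1}{2}\sum_{i,j} w_{ij}\, x'_i x'_j + \tfrac{1}{2}\sum_{i,j} w_{ij}\, x_i x_j \\
  &= -(x'_l - x_l)\sum_{i \neq l} w_{li}\, x_i
     \qquad \text{(only the terms containing $x_l$ change; $w_{ij} = w_{ji}$, $w_{ll} = 0$)} \\
  &< 0 \qquad \text{by the inequality above.}
\end{align*}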
Convergence result
• Every firing of a neuron strictly decreases E, and E can take only finitely many values (there are finitely many states); therefore only a finite number of firings can occur, and the network converges to a stable state, which is a (local) minimum of the energy E.
Computer experiment
[Figures: results of the computer experiment.]
Storage Capacity
C ≡ \frac{M}{N} = \frac{1}{4 \ln N}
• That is, if β is the probability that the j-th bit of the m-th fundamental memory is correctly retrieved, for all j = 1, ..., N and m = 1, ..., M, then requiring β to stay close to 1 limits the number of fundamental memories to about M_max = N / (4 ln N), i.e. the capacity C given above.
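As a quick numerical illustration (not from the slides): for a network with N = 100 neurons,

M_{\max} \approx \frac{N}{4 \ln N} = \frac{100}{4 \ln 100} \approx \frac{100}{18.4} \approx 5,

so only about five fundamental memories can be stored and retrieved reliably.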
TSP
• Travelling Salesman Problem (TSP): given N cities with distances d_{ij}, what is the shortest tour?
Encoding
• Construct a Hopfield network with N² nodes.
• Semantics: n_{ia} = 1 iff town i is visited at step a.
• Constraints:
\sum_{i} n_{ia} = 1 \quad \forall a, \qquad \sum_{a} n_{ia} = 1 \quad \forall i
w_{ij,ab} = d_{ij}
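One common way to turn these constraints and the tour length into weights is the Hopfield-Tank energy function, sketched here as a reconstruction (the penalty coefficients A, B, C, D are assumed positive constants not given on the slide):

E = \frac{A}{2}\sum_{i}\sum_{a}\sum_{b \neq a} n_{ia} n_{ib}
  + \frac{B}{2}\sum_{a}\sum_{i}\sum_{j \neq i} n_{ia} n_{ja}
  + \frac{C}{2}\Big(\sum_{i}\sum_{a} n_{ia} - N\Big)^{2}
  + \frac{D}{2}\sum_{i}\sum_{j \neq i}\sum_{a} d_{ij}\, n_{ia} (n_{j,a+1} + n_{j,a-1})

The first two terms penalise visiting a town at more than one step and visiting two towns at the same step, the third forces exactly N active units, and the last term is proportional to the tour length; running the Hopfield network then searches for a low-energy, i.e. short and valid, tour.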