
Pattern Classifiers

Hebb Nets
• A single-layer feedforward neural net trained with Hebb
learning is known as a Hebb net.
• The Hebb learning rule is the training procedure for a Hebb net; as an
example, a net can be trained to implement the logical AND function.
Hebb Rule
• The Hebb rule is one of the earliest learning rules for ANNs. According
to this rule, the weight adjustment is computed as
• ∆wi = xi * t
• where t is the target activation and xi is the ith input.
• There are certain points to keep in mind regarding the Hebb
learning rule.
• First, the Hebb rule cannot learn when the target is 0.
• This is because the weight adjustment ∆wi becomes zero when t = 0,
irrespective of the value of xi.
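The point above can be checked with a small sketch (the helper name is my own choice): with a binary target of 0 the Hebb increment vanishes, while a bipolar target of -1 still produces a nonzero update.

```python
# Hypothetical helper illustrating the Hebb increment delta_wi = xi * t.
def hebb_delta(x, t):
    """Per-weight increments for one training pair (input vector x, target t)."""
    return [xi * t for xi in x]

# Binary target 0: every increment is 0, so the net learns nothing.
print(hebb_delta([1, 1, 0], 0))    # [0, 0, 0]
# Bipolar target -1: the increments are nonzero and learning proceeds.
print(hebb_delta([1, 1, -1], -1))  # [-1, -1, 1]
```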

• Hence, the Hebb rule results in better learning if both the
inputs and the outputs are in bipolar form (+1/-1).
Limitation of Hebb learning:
• The most striking limitation of the Hebb rule is that it does not
guarantee a correct classifier will be learnt, even when the classes
are linearly separable.
Procedure Hebb Learning
Step 1. Initialize all weights to 0:
        wi = 0 for all i = 0 to m.
Step 2. For each training vector and target output pair
        s : t, do Steps 3-5.
Step 3. Assign the input vector to the input layer:
        x0 = 1, and xi = si for all i = 1 to m.
Step 4. Activate the output unit with the target output:
        y_out = t.
Step 5. Adjust the weights:
        wi(new) = wi(old) + xi * y_out.
Step 6. Stop.
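The steps above can be sketched in Python as follows (function and variable names are my own choices, not from the source):

```python
# A minimal sketch of Procedure Hebb Learning.
def hebb_train(samples, m):
    """samples: list of (s, t) pairs, with s a length-m input vector
    and t the bipolar target. Returns the weight vector [w0, ..., wm]."""
    w = [0] * (m + 1)              # Step 1: all weights (incl. bias w0) start at 0
    for s, t in samples:           # Step 2: loop over training pairs s : t
        x = [1] + list(s)          # Step 3: x0 = 1 (bias), xi = si
        y_out = t                  # Step 4: output unit clamped to the target
        for i in range(m + 1):     # Step 5: wi(new) = wi(old) + xi * y_out
            w[i] += x[i] * y_out
    return w                       # Step 6: stop
```

Note that training makes a single pass over the data; unlike the perceptron rule, there is no error term and no repetition until convergence.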
Example: Realizing the logical AND
function through Hebb learning.
• To realize a two-input AND function we need a net with two input units
and one output unit.
• A bias is also needed.
• Hence the structure of the required neural net is as shown in the
following figure.
• Moreover, the input and output signals must be in bipolar form, rather
than binary form, so that the net may be trained properly.
• Considering the truth table of the AND operation, and the fact that the
bias input is permanently set to 1, we get the training set depicted in
the following table.
Structure of a neural net to realize the
AND function

[Fig.: input units X1, X2 (signals x1, x2, weights W1, W2) and bias unit
X0 = 1 (weight W0) feeding the output unit Y, which computes
y_out = x1 ^ x2]
Table: Training set for AND function
Input Patterns Output
X0 X1 X2 t
+1 1 1 1
+1 1 -1 -1
+1 -1 1 -1
+1 -1 -1 -1

• During the training process, all weights are initialized to 0.
• Therefore, initially W0 = W1 = W2 = 0.
• At each training instance, the weights are changed according to the
formula
• Wi(new) = Wi(old) + ∆Wi
• where ∆Wi, the increment in Wi, is computed as ∆Wi = xi * t.
• After initialization, the progress of the learning process by the
network is shown in the following table.
Hebbian learning of AND function
#  x0  x1  x2   t  ∆W0 ∆W1 ∆W2   W0  W1  W2
1  +1   1   1   1    1   1   1    1   1   1
2  +1   1  -1  -1   -1  -1   1    0   0   2
3  +1  -1   1  -1   -1   1  -1   -1   1   1
4  +1  -1  -1  -1   -1   1   1   -2   2   2
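The trace can be reproduced and checked in Python (a sketch with my own variable names; the recall step with a bipolar step activation is my addition, used only to confirm the learnt weights realize AND):

```python
# Bipolar training set for AND: ((x1, x2), t), bias x0 = 1 prepended below.
samples = [((1, 1), 1), ((1, -1), -1), ((-1, 1), -1), ((-1, -1), -1)]

w = [0, 0, 0]                      # (W0, W1, W2) initialized to 0
for s, t in samples:
    x = (1,) + s                   # prepend the bias input x0 = 1
    w = [wi + xi * t for wi, xi in zip(w, x)]  # Wi(new) = Wi(old) + xi * t
    print(s, t, w)                 # one row of the trace per pattern

# Recall with a bipolar step activation to verify the final weights.
for s, t in samples:
    net = w[0] + w[1] * s[0] + w[2] * s[1]
    y = 1 if net >= 0 else -1
    print(s, y, y == t)
```

The final weights come out as (W0, W1, W2) = (-2, 2, 2), matching the last row of the table, and the recall pass classifies all four patterns correctly.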
