
Artificial Neural Networks

An artificial neural network is an information processing system that has certain performance characteristics in common with biological neural networks. Such networks have been developed as generalizations of mathematical models of human neural biology.

Neural Networks Applications


Face recognition
Time series prediction
Process identification
Process control
Optical character recognition
Adaptive filtering
Classification
Data association

Artificial neural networks try to emulate the biological neurons in the brain.

Biological Neuron

[Figure: a biological neuron, showing the synapse, nucleus, cell body, dendrites and axon]

The main parts of the biological neuron are

1. Cell body (soma)
2. Nucleus
3. Dendrites
4. Synapse
5. Axon

The synapse receives incoming signals and changes the electrical (ionic) potential of the cell body. When the potential of the cell body reaches some limit, the neuron fires and an electrical signal (action potential) is sent down the axon. The axon propagates the signal to other neurons downstream.
---------------x-------------------

Artificial Neural Network


In creating an artificial neural network, the following assumptions are made:
1. Information processing occurs at many simple elements called neurons.
2. Signals are passed between neurons over connection links.
3. Each connection link has an associated weight, which multiplies the signal transmitted.
4. Each neuron applies an activation function to its net input (the sum of its weighted input signals) to determine its output.

Artificial Neuron
[Figure: an artificial neuron with inputs x1, x2, ..., xn, weights w1, w2, ..., wn, net input net, and activation function f(net)]

The basic operation of an artificial neuron involves summing its weighted inputs, net = x1 w1 + x2 w2 + ... + xn wn, and applying an activation function to the result. The output f(net) depends on the type of activation function. net is also sometimes termed y_in.
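As a minimal sketch, this operation can be written out directly; the function name and the choice of a binary step activation here are illustrative, not fixed by the text.

```python
# Sketch of a single artificial neuron: sum the weighted inputs
# (net, also called y_in), then apply an activation function.

def neuron_output(inputs, weights):
    net = sum(x * w for x, w in zip(inputs, weights))
    # Binary step activation (illustrative choice): 1 if net >= 0, else 0.
    return 1 if net >= 0 else 0

print(neuron_output([1, 0, 1], [0.5, -0.3, 0.2]))  # net = 0.7 -> 1
```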

-----------------x---------------------

Different Activation Functions

Identity function
Binary step function
Bipolar step function
Sigmoidal functions:
  Binary sigmoidal function
  Bipolar sigmoidal function
Ramp function

Identity function

f(x) = x

Binary step function without threshold

f(x) = 1 if x ≥ 0
       0 if x < 0

Bipolar step function with threshold

f(x) = 1 if x ≥ θ
      -1 if x < θ

where θ is the threshold.

Bipolar step function without threshold

f(x) = 1 if x ≥ 0
      -1 if x < 0

Binary sigmoidal function

f(x) = 1 / (1 + e^(-σx))

where σ is the steepness parameter. The curve rises from 0 to 1, passing through 0.5 at x = 0; a larger σ (e.g. σ = 3 versus σ = 1) gives a steeper curve.

Bipolar sigmoidal function

f(x) = (1 - e^(-σx)) / (1 + e^(-σx)) = 2 / (1 + e^(-σx)) - 1

The curve rises from -1 to 1, passing through 0 at x = 0.
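The activation functions above can be sketched in Python as follows; σ is written `sigma`, θ is `theta`, and the function names are illustrative.

```python
import math

def identity(x):
    # f(x) = x
    return x

def binary_step(x):
    # 1 if x >= 0, else 0
    return 1 if x >= 0 else 0

def bipolar_step(x, theta=0.0):
    # 1 if x >= theta, else -1 (theta = 0 gives the no-threshold form)
    return 1 if x >= theta else -1

def binary_sigmoid(x, sigma=1.0):
    # Rises from 0 to 1; sigma is the steepness parameter.
    return 1.0 / (1.0 + math.exp(-sigma * x))

def bipolar_sigmoid(x, sigma=1.0):
    # Rises from -1 to 1; equals 2 * binary_sigmoid(x) - 1.
    return 2.0 / (1.0 + math.exp(-sigma * x)) - 1.0
```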

---------------x---------------

Learning or training of NN

Learning or training of an NN means setting the values of the weights and biases to achieve a specified task. There are two types of learning:

1. Supervised learning
2. Unsupervised learning

Supervised learning
The neural network is presented with a sequence of training vectors or patterns. For each pattern an associated target output vector (the desired output) is provided. When a pattern similar, but not identical, to a training pattern is presented, the NN produces an output. This actual output is compared with the target and an error signal is generated. The weights are then adjusted to obtain a new set of weights, according to a training algorithm which uses the error signal.

Supervised learning block diagram

[Block diagram: the input vector is applied to the neural network (weights); the network's actual output is compared with the target vector to form an error vector, which drives the training algorithm that adjusts the weights]

Unsupervised learning
A sequence of input training vectors is provided to the NN, but no target is specified. The net modifies the weights according to some training algorithm such that the most similar input vectors are assigned to the same output.

Block diagram of unsupervised learning

[Block diagram: the input vector is applied to the neural network (weights), which produces an output vector; the training algorithm adjusts the weights without any target vector]

---------------x------------------

Different learning rules


Hebb rule
Perceptron learning rule
Widrow-Hoff rule
Generalized delta learning rule (back propagation rule)

1. Hebb rule

Weight updation
Wi(new) = Wi(old) + ΔWi
ΔWi = xi y
where Wi is the weight on input xi and y is the output.

Bias updation
b(new) = b(old) + Δb
Δb = y

2. Perceptron learning rule

Weight updation
Wi(new) = Wi(old) + ΔWi
ΔWi = α t xi
where t is the target and α is the learning rate.

Bias updation
b(new) = b(old) + Δb
Δb = α t

3. Widrow-Hoff rule

Weight updation
Wi(new) = Wi(old) + ΔWi
ΔWi = α (t - y_in) xi

Bias updation
b(new) = b(old) + Δb
Δb = α (t - y_in)
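A single Widrow-Hoff update step can be sketched as follows; the names are illustrative, with `alpha` standing for the learning rate α.

```python
# One Widrow-Hoff (delta rule) update: the error is measured against the
# net input y_in = b + sum(x_i * w_i) rather than a thresholded output.

def widrow_hoff_step(weights, bias, x, t, alpha=0.1):
    y_in = bias + sum(xi * wi for xi, wi in zip(x, weights))
    error = t - y_in
    new_weights = [wi + alpha * error * xi for wi, xi in zip(weights, x)]
    new_bias = bias + alpha * error
    return new_weights, new_bias
```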

4. Generalized delta learning rule (or back propagation rule)

Weights for output units
Wjk(new) = Wjk(old) + α δk zj
where α is the learning rate and δk is the error factor.

Bias for output units
W0k(new) = W0k(old) + α δk

Weights for hidden units
Vij(new) = Vij(old) + α δj xi

Bias for hidden units
V0j(new) = V0j(old) + α δj

-----------------------

NN Architecture

The arrangement of neurons into layers and the connection pattern within and between layers is called the net architecture. In determining the number of layers, the input units are not counted as a layer.

Types of architecture

1. Single layer net
2. Multi layer net

Single layer net

A single layer net consists of
1. Input units, which receive signals from outside
2. Output units, from which the response of the net can be read
3. One layer of connection weights between the input and output units

Single layer net diagram

[Figure: input units x1, ..., xi, ..., xn fully connected to output units y1, ..., yj, ..., ym; weight wij is on the connection from input unit xi to output unit yj]

The input units are fully connected to the output units but are not connected to other input units; i.e., there is no interconnection among the input units. Similarly, output units are not connected to other output units.
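The forward pass of such a single layer net can be sketched as a matrix-vector product; the names and the example weights below are illustrative.

```python
# Each output unit y_j receives sum over i of x_i * w[i][j];
# w has one row per input unit and one column per output unit.

def single_layer_forward(x, w):
    n_out = len(w[0])
    return [sum(x[i] * w[i][j] for i in range(len(x)))
            for j in range(n_out)]

# Two input units, three output units.
y = single_layer_forward([1, 2], [[0.1, 0.2, 0.3],
                                  [0.4, 0.5, 0.6]])
```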

Multi layer NN

A multi layer net is one with one or more layers (or levels) of nodes, known as hidden units or hidden layers, between the input units and the output units. There is a layer of weights between each two adjacent levels of units. There is no interconnection among the units of the same layer.

Multi layer NN diagram

[Figure: input units x1, ..., xi, ..., xn connected by weights uij to hidden units z1, ..., zj, ..., zp, which are connected by a second layer of weights to output units y1, ..., yk, ..., ym]

----------------x-----------------

Different Artificial Neuron Models

McCulloch-Pitts Neuron


[Figure: a McCulloch-Pitts neuron y with excitatory inputs x1, ..., xn on paths of weight w and inhibitory inputs xn+1, ..., xn+m on paths of weight -p]

Architecture description: x1, x2, ..., xn, xn+1, ..., xn+m are the inputs. McCulloch-Pitts neurons are connected by directed, weighted paths. A connection path is excitatory if the weight on the path is positive; otherwise it is inhibitory. All excitatory connections into a particular neuron have the same weight. This model uses a fixed threshold θ.
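A sketch of the model follows; the AND example and the parameter values are illustrative choices, not taken from the text.

```python
# McCulloch-Pitts neuron: excitatory inputs share weight w, inhibitory
# inputs share weight -p, and the neuron fires (outputs 1) only when the
# net input reaches the fixed threshold theta.

def mp_neuron(excitatory, inhibitory, w, p, theta):
    net = w * sum(excitatory) - p * sum(inhibitory)
    return 1 if net >= theta else 0

# AND of two binary inputs: w = 1, no inhibitory inputs, theta = 2,
# so the neuron fires only when both excitatory inputs are active.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, mp_neuron([x1, x2], [], w=1, p=0, theta=2))
```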

The selection of θ should satisfy the inequality θ > nw - p. If the neuron has only excitatory inputs and no inhibitory inputs, then kw ≥ θ > (k-1)w, where k is the number of excitatory inputs that must be active for the neuron to fire.
---------------x-----------------

NN with bias
[Figure: a neuron with inputs x1, x2, ..., xn on weights w1, w2, ..., wn, and a bias b on a connection from a constant unit whose activation is 1]
A bias acts exactly like a weight on a connection from a unit whose activation is always 1. Increasing the bias increases the net input to the unit. If a bias is included, the activation function is typically taken to be

f(net) = 1 if net ≥ 0
        -1 if net < 0

If no bias is included, a fixed threshold θ is used instead.
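A small sketch of a neuron with the bias treated as a weight on a constant input of 1; the names and values are illustrative.

```python
# The bias b simply shifts the net input before the bipolar step
# activation is applied, so no separate threshold is needed.

def neuron_with_bias(x, w, b):
    net = b + sum(xi * wi for xi, wi in zip(x, w))
    return 1 if net >= 0 else -1

print(neuron_with_bias([1, 1], [0.3, 0.3], b=-1.0))  # net = -0.4 -> -1
```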

Hebb Net
[Figure: a single neuron with inputs x1, x2, ..., xn, weights w1, w2, ..., wn, and a bias b from a constant unit whose activation is 1]

Hebb Net Algorithm

Step 1 - Initialize all weights and the bias to 0.
For each training pair (x, t), do Steps 2-4:
Step 2 - Set the output y = t.
Step 3 - Adjust the weights: Wi(new) = Wi(old) + ΔWi, with ΔWi = xi y.
Step 4 - Adjust the bias: b(new) = b(old) + Δb, with Δb = y.

The activation function used is the binary step without threshold if the targets are binary. For bipolar targets, the bipolar step without threshold is used.
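The Hebb net algorithm above can be sketched as follows, using the bipolar AND function as an illustrative training set.

```python
# Hebb net training: one pass over the patterns, with the output y set
# equal to the target t before each update.

def train_hebb(patterns):
    n = len(patterns[0][0])
    w, b = [0] * n, 0
    for x, t in patterns:
        y = t                                      # Step 2: y = t
        w = [wi + xi * y for wi, xi in zip(w, x)]  # Step 3: dW_i = x_i * y
        b = b + y                                  # Step 4: db = y
    return w, b

# Bipolar AND: output 1 only for input (1, 1).
and_patterns = [([1, 1], 1), ([1, -1], -1), ([-1, 1], -1), ([-1, -1], -1)]
w, b = train_hebb(and_patterns)
print(w, b)  # [2, 2] -2
```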

The Perceptron
[Figure: a single neuron with inputs x1, x2, ..., xn, weights w1, w2, ..., wn, and a bias b from a constant unit whose activation is 1]

The perceptron uses an iterative learning procedure to converge to the correct weights, i.e. the weights that allow the net to produce the correct output value for each of the training patterns. For each training input, the net calculates the response of the output unit. It then determines whether an error occurred for this pattern. If an error occurred for a particular pattern, the weights and bias are changed according to the perceptron learning rule. Training continues until no error occurs.

The Perceptron Algorithm

Step 1 - Initialize the weights and bias, generally to zeros.

Step 2 - Set the learning rate α (0 < α ≤ 1).

Step 3 - Apply the inputs and calculate the net input y_in = b + Σ xi wi.

Step 4 - Calculate the output
y = 1 if y_in > θ
    0 if -θ ≤ y_in ≤ θ
   -1 if y_in < -θ

Step 5 - Compare the output y with the target t. If y = t, there is no need for updation; if y ≠ t, the weights and bias are to be updated.

Step 6 - Wi(new) = Wi(old) + ΔWi, with ΔWi = α t xi
b(new) = b(old) + Δb, with Δb = α t

Step 7 - Test the stopping condition, i.e. no weight change.
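The algorithm can be sketched as follows; the AND training set and the values α = 1 and θ = 0.2 are illustrative choices.

```python
# Perceptron training with learning rate alpha and threshold theta,
# following Steps 1-7: update weights and bias only when the
# three-valued output disagrees with the (bipolar) target.

def perceptron_train(patterns, alpha=1.0, theta=0.2, max_epochs=100):
    n = len(patterns[0][0])
    w, b = [0.0] * n, 0.0                            # Step 1
    for _ in range(max_epochs):
        changed = False
        for x, t in patterns:
            y_in = b + sum(xi * wi for xi, wi in zip(x, w))   # Step 3
            # Step 4: output with an undecided band of width 2*theta.
            y = 1 if y_in > theta else (-1 if y_in < -theta else 0)
            if y != t:                               # Step 5
                w = [wi + alpha * t * xi for wi, xi in zip(w, x)]  # Step 6
                b = b + alpha * t
                changed = True
        if not changed:                              # Step 7
            break
    return w, b

and_patterns = [([1, 1], 1), ([1, -1], -1), ([-1, 1], -1), ([-1, -1], -1)]
w, b = perceptron_train(and_patterns)
print(w, b)  # converges to [1.0, 1.0] -1.0
```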
