AIM:
To plot different activation functions used in Artificial Neural Networks.
SOFTWARE REQUIRED:
Jupyter Notebook. Libraries: Matplotlib, NumPy
THEORY:
Activation Function-
Activation functions are mathematical equations that determine the output of
a neural network. The function is attached to each neuron in the network,
and determines whether it should be activated (“fired”) or not, based on
whether each neuron’s input is relevant for the model’s prediction.
Activation functions also help normalize the output of each neuron to a
range between 0 and 1 or between -1 and 1.
An additional aspect of activation functions is that they must be
computationally efficient because they are calculated across thousands or
even millions of neurons for each data sample.
Modern neural networks use a technique called backpropagation to train the
model, which places an increased computational strain on the activation
function, and its derivative function.
The different activation functions to be plotted are:
1. Hardlimit
2. Symmetrical Hardlimit
3. Linear
4. Saturating Linear
5. Symmetric Saturating Linear
6. Log Sigmoid
7. Hyperbolic Tangent Sigmoid
8. ReLU
1. Hardlimit function:
The hard limit transfer function forces a neuron to output a 1 if its net input
reaches a threshold, otherwise it outputs 0. This allows a neuron to make a
decision or classification. It can say yes or no. This kind of neuron is often
trained with the perceptron learning rule.
Transfer functions calculate a layer's output from its net input.
hardlim(N) takes one input, the net input N.
Equation:
a = 0 for n<0
a = 1 for n>=0
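As a sketch, the hard limit function can be plotted with NumPy and Matplotlib; the range of net inputs (-5 to 5) is an arbitrary choice for illustration:

```python
# Plot the hard limit (hardlim) transfer function.
import numpy as np
import matplotlib.pyplot as plt

n = np.linspace(-5, 5, 200)   # net input values (illustrative range)
a = np.where(n < 0, 0, 1)     # a = 0 for n < 0, a = 1 for n >= 0

plt.plot(n, a, label="hardlim")
plt.legend(loc="upper left")
plt.axhline(0, linewidth=2, color="#000000")
plt.axvline(0, linewidth=2, color="#000000")
```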
2. Symmetrical Hardlimit function:
The symmetric hard limit transfer function forces a neuron to output a 1 if its
net input reaches a threshold. Otherwise it outputs -1. Like the regular hard
limit function, this allows a neuron to make a decision or classification. It
can say yes or no.
hardlims is a transfer function. Transfer functions calculate a layer's output
from its net input.
hardlims(N) takes one input, the net input N.
Equation:
a = -1 for n<0
a = 1 for n>=0
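The symmetric hard limit differs from hardlim only in its negative branch, so the same plotting sketch applies with -1 in place of 0:

```python
# Plot the symmetric hard limit (hardlims) transfer function.
import numpy as np
import matplotlib.pyplot as plt

n = np.linspace(-5, 5, 200)    # net input values (illustrative range)
a = np.where(n < 0, -1, 1)     # a = -1 for n < 0, a = 1 for n >= 0

plt.plot(n, a, label="hardlims")
plt.legend(loc="upper left")
plt.axhline(0, linewidth=2, color="#000000")
plt.axvline(0, linewidth=2, color="#000000")
```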
3. Linear function:
A linear activation function takes the form:
a = c*n
It takes the input, multiplied by the weights for each neuron, and creates
an output signal proportional to the input. In one sense, a linear function is
better than a step function because it allows a continuous range of outputs,
not just yes and no.
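A minimal sketch of the linear function follows; the constant c = 2 is an arbitrary choice for illustration:

```python
# Plot the linear transfer function a = c*n (c = 2 chosen for illustration).
import numpy as np
import matplotlib.pyplot as plt

c = 2
n = np.linspace(-5, 5, 200)   # net input values (illustrative range)
a = c * n                     # output proportional to the net input

plt.plot(n, a, label="linear (c=2)")
plt.legend(loc="upper left")
plt.axhline(0, linewidth=2, color="#000000")
plt.axvline(0, linewidth=2, color="#000000")
```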
4. Saturating linear function:
It takes input as n and output as a.
Equation:
a = 0 for n<0
a = n for 0<=n<=1
a = 1 for n>1
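The saturating linear equations above clamp the net input to the interval [0, 1], which can be sketched with NumPy's clip:

```python
# Plot the saturating linear (satlin) transfer function.
import numpy as np
import matplotlib.pyplot as plt

n = np.linspace(-5, 5, 200)   # net input values (illustrative range)
a = np.clip(n, 0, 1)          # a = 0 for n < 0, a = n for 0 <= n <= 1, a = 1 for n > 1

plt.plot(n, a, label="satlin")
plt.legend(loc="upper left")
plt.axhline(0, linewidth=2, color="#000000")
plt.axvline(0, linewidth=2, color="#000000")
```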
5. Symmetric saturating linear function:
It takes input as n and output as a.
Equation:
a = -1 for n<-1
a = n for -1<=n<=1
a = 1 for n>1
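Likewise, the symmetric variant clamps the net input to [-1, 1]:

```python
# Plot the symmetric saturating linear (satlins) transfer function.
import numpy as np
import matplotlib.pyplot as plt

n = np.linspace(-5, 5, 200)   # net input values (illustrative range)
a = np.clip(n, -1, 1)         # a = -1 for n < -1, a = n for -1 <= n <= 1, a = 1 for n > 1

plt.plot(n, a, label="satlins")
plt.legend(loc="upper left")
plt.axhline(0, linewidth=2, color="#000000")
plt.axvline(0, linewidth=2, color="#000000")
```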
6. Log Sigmoid function:
It squashes the net input n into the range (0, 1), producing output a.
Equation:
a = 1 / (1 + e^(-n))
7. Hyperbolic Tangent Sigmoid function:
It squashes the net input n into the range (-1, 1), producing output a.
Equation:
a = (e^n - e^(-n)) / (e^n + e^(-n))
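The Log Sigmoid and Hyperbolic Tangent Sigmoid functions listed above can be sketched the same way, using their standard formulas logsig(n) = 1/(1+e^(-n)) and tansig(n) = tanh(n):

```python
# Plot the log sigmoid and hyperbolic tangent sigmoid transfer functions.
import numpy as np
import matplotlib.pyplot as plt

n = np.linspace(-5, 5, 200)       # net input values (illustrative range)
a_logsig = 1 / (1 + np.exp(-n))   # squashes n into (0, 1)
a_tansig = np.tanh(n)             # squashes n into (-1, 1)

plt.plot(n, a_logsig, label="log sigmoid")
plt.plot(n, a_tansig, label="tanh sigmoid")
plt.legend(loc="upper left")
plt.axhline(0, linewidth=2, color="#000000")
plt.axvline(0, linewidth=2, color="#000000")
```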
8. ReLU:
The Rectified Linear Unit outputs the net input directly if it is positive;
otherwise it outputs zero.
Equation:
a = 0 for n<0
a = n for n>=0
#Relu
import numpy as np
import matplotlib.pyplot as plt

n = np.linspace(-5, 5, 200)   # net input values
y = []
for i in n:
    if i < 0:
        y.append(0)   # negative inputs are clipped to zero
    else:
        y.append(i)   # positive inputs pass through unchanged
plt.plot(n, y, label="ReLU")
plt.legend(loc="upper left")
plt.axhline(0, linewidth=2, color='#000000')
plt.axvline(0, linewidth=2, color='#000000')
CONCLUSION:
The different activation functions used in Artificial Neural Networks were
plotted successfully using Matplotlib.