
NEURAL NETWORKS

CONTENTS

Neural Network
Artificial Neural Network
Network Topology
Weights
Activation Function
Propagation Rule
Deep Learning

RADHWANA KAMAL
FREEDOM PHILIP LACHICA
RAZYL MAE PINO
DANIEL REYES
MIKHAELA TAGALOG
NEURAL NETWORK 1

A neural network is a general mathematical computing paradigm that models the operations of biological neural systems.

[Figure: the human visual pathway — Lateral Geniculate Nucleus, Primary Visual Cortex, Secondary Visual Cortex, Tertiary Visual Cortex, Lateral Occipital Complex]
Hu, Y.H., Hwang, J.N. (2002).
Handbook of Neural Network Signal
Processing. Boca Raton, Florida: CRC
Press LLC.
NEURAL NETWORK 2

A neural network is a dynamical system with one-way interconnections. It carries out processing by its response to inputs.

Node - processing element.
Directed Links - interconnects.
Harvey, R.L. (1994). Neural
Network Principles. Englewood
Cliffs, New Jersey: Prentice-Hall,
Inc.
NEURAL NETWORK 3

SCHEMATIC DIAGRAM OF A NEURAL NETWORK
Each processing
element (neuron) has
many signal inputs and
a signal output that
branches into copies.
Harvey, R.L. (1994). Neural
Network Principles. Englewood
Cliffs, New Jersey: Prentice-Hall,
Inc.
NEURAL NETWORK 4

BRANCHES OF NEURAL NETWORK THEORY

1. Perceptron
2. Associative Memory
3. Biological Model

Harvey, R.L. (1994). Neural


Network Principles. Englewood
Cliffs, New Jersey: Prentice-Hall,
Inc.
ARTIFICIAL NEURAL NETWORK 5

An artificial neural network is a simplified model of the biological neuron that makes decisions and draws conclusions by simulating the way the human brain works.

Bryant, D.A., Frigaard, N.U. (2006).


Training Artificial Neural Networks
Using APPM.
BASIC NEURAL NETWORK 6

COMPONENTS

McCulloch and Pitts' Neuron Model

Neuron
- common building block of a neural network.
- consists of two parts:
1. Net Function
2. Activation Function

Net Function
- determines how the network inputs { yj ; 1 ≤ j ≤ N } are combined inside the neuron. A weighted linear combination is adopted:

u = w1·y1 + w2·y2 + ... + wN·yN + θ

[Diagram: network inputs → synaptic weights → bias or threshold → network output]

Hu, Y.H., Hwang, J.N. (2002). Handbook of Neural Network Signal Processing. Boca Raton, Florida: CRC Press LLC.
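The weighted linear combination above can be sketched in Python. The function follows the slide's notation (inputs yj, synaptic weights wj, bias θ); the specific numbers are illustrative only, not from the slides:

```python
# Net function of a McCulloch-Pitts style neuron: a weighted linear
# combination of the inputs plus a bias/threshold term.

def net_function(y, w, theta):
    """u = sum_j w_j * y_j + theta"""
    return sum(wj * yj for wj, yj in zip(w, y)) + theta

y = [1.0, 0.0, 1.0]      # network inputs
w = [0.5, -0.2, 0.3]     # synaptic weights
theta = 0.1              # bias / threshold

u = net_function(y, w, theta)  # 0.5*1 + (-0.2)*0 + 0.3*1 + 0.1 = 0.9
```
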
BASIC NEURAL NETWORK 7

COMPONENTS

McCulloch and Pitts' Neuron Model

Activation Function
- linear or nonlinear transformation that relates the net value ui to the network output ai.

[Diagram: network inputs → synaptic weights → bias or threshold → network output]

Hu, Y.H., Hwang, J.N. (2002). Handbook of Neural Network Signal Processing. Boca Raton, Florida: CRC Press LLC.
BASIC NEURAL NETWORK 8

COMPONENTS
Summary of Net Functions

Hu, Y.H., Hwang, J.N. (2002). Handbook of Neural Network Signal Processing. Boca Raton, Florida:
CRC Press LLC.
BASIC NEURAL NETWORK 9

COMPONENTS
Neuron Activation Functions

Hu, Y.H., Hwang, J.N. (2002). Handbook of Neural Network Signal Processing. Boca Raton, Florida: CRC
Press LLC.
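The summary table itself did not survive extraction; as a sketch, a few activation functions of the kind such a table covers can be written directly. The particular selection here (sigmoid, hyperbolic tangent, hard threshold, rectifier) is illustrative, not a reproduction of the handbook's table:

```python
import math

# Activation functions f that map the net value u to the neuron
# output a = f(u).

def sigmoid(u):
    return 1.0 / (1.0 + math.exp(-u))   # squashes u into (0, 1)

def tanh(u):
    return math.tanh(u)                 # squashes u into (-1, 1)

def step(u):
    return 1.0 if u >= 0 else 0.0       # hard threshold (McCulloch-Pitts)

def relu(u):
    return max(0.0, u)                  # rectified linear
```
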
NEURAL NETWORK TOPOLOGY 10

Neural Network
- multiple neurons are interconnected to form a network to facilitate distributed computing.

Directed Graph
- consists of nodes (neurons) and directed arcs (synaptic links).
- the topology can be: 1. Acyclic  2. Cyclic

ACYCLIC NEURAL NETWORK
- consists of no feedback loops.
- often used to approximate a nonlinear mapping between its inputs and outputs.

CYCLIC NEURAL NETWORK
- contains at least one cycle formed by directed arcs; also known as a recurrent network.
- leads to a nonlinear dynamic system model that contains internal memory.

Hu, Y.H., Hwang, J.N. (2002). Handbook of Neural Network Signal Processing. Boca Raton, Florida: CRC Press LLC.
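Since a topology is a directed graph, the acyclic/cyclic distinction can be decided with a standard depth-first cycle check. A minimal sketch; the two three-node graphs are made up for illustration:

```python
# Classify a network topology, given as a directed graph
# {node: [successor nodes]}, as acyclic (feedforward) or
# cyclic (recurrent).

def is_cyclic(graph):
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {n: WHITE for n in graph}

    def visit(n):
        color[n] = GRAY
        for m in graph.get(n, []):
            if color[m] == GRAY:              # back edge -> cycle found
                return True
            if color[m] == WHITE and visit(m):
                return True
        color[n] = BLACK
        return False

    return any(visit(n) for n in graph if color[n] == WHITE)

feedforward = {"x": ["h"], "h": ["y"], "y": []}     # no feedback loop
recurrent   = {"x": ["h"], "h": ["y"], "y": ["h"]}  # cycle h -> y -> h
```
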
NEURAL NETWORK TOPOLOGY 11

[Figures: an acyclic neural network and a cyclic neural network, side by side]
Hu, Y.H., Hwang, J.N. (2002). Handbook of Neural Network Signal Processing. Boca Raton, Florida:
CRC Press LLC.
SYNAPTIC WEIGHTS 12

Weight
- strength characterizing each connecting link in a neural network.

[Diagram: synaptic weights labeled on the connecting links]

Liu, G.P. (2001). Nonlinear Identification and Control: A Neural Network Approach. Nottingham, England: Springer-Verlag London.
NEURAL NETWORK OPERATION 13

A neuron is the basic unit of a neural network. It receives a certain number of inputs and a bias value. When a signal (value) arrives, it is multiplied by a weight value.

E.g. if a neuron has 4 inputs, it has 4 weight values, which can be adjusted during training.

Ahirwar, K. (2017, November 1).


Everything You Need To Know About
Neural Networks. Retrieved from
hackernoon.com
NEURAL NETWORK OPERATION 14

A link connects a neuron in one layer to a neuron in another layer or the same layer. A link always has a weight value associated with it. The goal of training is to update this weight value to decrease the loss (error).

Ahirwar, K. (2017, November 1). Everything You Need To Know About Neural Networks. Retrieved from hackernoon.com
NEURAL NETWORK OPERATION 15

A bias is an extra input to a neuron that is always 1 and has its own synaptic weight. This makes sure that even when all the inputs are zero (all 0's), there is still going to be an activation in the neuron.
Ahirwar, K. (2017, November 1).
Everything You Need To Know About
Neural Networks. Retrieved from
hackernoon.com
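The effect of the always-on bias input can be demonstrated with all-zero inputs. The weights and bias value below are illustrative; sigmoid is used here as one possible activation:

```python
import math

def sigmoid(u):
    return 1.0 / (1.0 + math.exp(-u))

def neuron(inputs, weights, bias_weight):
    # The bias behaves like an extra input fixed at 1 with its own weight.
    u = sum(w * x for w, x in zip(weights, inputs)) + bias_weight * 1.0
    return sigmoid(u)

# Even with every input at 0, the bias alone drives an activation.
out = neuron([0.0, 0.0, 0.0], [0.4, -0.1, 0.7], bias_weight=2.0)
# out == sigmoid(2.0), well above the neutral value 0.5
```
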
NEURAL NETWORK OPERATION 16

An activation function is used to introduce non-linearity into a neural network. It squashes values into a smaller range.

E.g. a sigmoid activation function squashes values into the range 0 to 1.

Ahirwar, K. (2017, November 1).


Everything You Need To Know About
Neural Networks. Retrieved from
hackernoon.com
NEURAL NETWORK OPERATION 17

A weight represents the strength of the connection between units. A weight decides how much influence the input will have on the output.

E.g. if the weight from node 1 to node 2 has greater magnitude, it means that neuron 1 has greater influence over neuron 2.

A weight raises or lowers the importance of the input value. Weights near zero mean changing this input will not change the output. Negative weights mean increasing this input will decrease the output.

Ahirwar, K. (2017, November 1).


Everything You Need To Know About
Neural Networks. Retrieved from
hackernoon.com
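The three weight effects above (near zero, negative, large magnitude) can be checked on a single linear unit. The input and weight values are made up for illustration:

```python
def output(x, w):
    # Single-input linear unit: the weight scales the input's influence.
    return w * x

x = 1.0
# Near-zero weight: changing the input barely changes the output.
small_change = output(x + 1.0, 0.001) - output(x, 0.001)
# Negative weight: increasing the input decreases the output.
negative_effect = output(x + 1.0, -0.5) - output(x, -0.5)
# Large magnitude: the same input change moves the output strongly.
large_change = output(x + 1.0, 3.0) - output(x, 3.0)
```
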
PROPAGATION RULE 18

FORWARD PROPAGATION

Forward propagation is the process of feeding input values to the neural network and getting an output, which we call the predicted value. Sometimes we refer to forward propagation as inference.

Ahirwar, K. (2017, November 1). Everything You Need To Know About Neural Networks. Retrieved from hackernoon.com
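A forward pass through a tiny two-layer network might look like the sketch below. The layer sizes and weights are made up, and sigmoid is just one possible activation:

```python
import math

def sigmoid(u):
    return 1.0 / (1.0 + math.exp(-u))

def layer(inputs, weights, biases):
    # One dense layer: a weighted sum per unit, then the activation.
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def forward(x):
    # input (2 values) -> hidden layer (2 units) -> output (1 unit)
    h = layer(x, [[0.5, -0.6], [0.1, 0.8]], [0.0, 0.1])
    y = layer(h, [[1.0, -1.0]], [0.2])
    return y[0]          # the "predicted value"

pred = forward([1.0, 0.0])
```
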
PROPAGATION RULE 19

BACK-PROPAGATION

Back-propagation uses the chain rule of differential calculus. We take derivatives of the error value with respect to the weight values of the last layer. We call these derivatives gradients and use them to calculate the gradients of the second-to-last layer. We repeat this process until we get gradients for each and every weight in the neural network. Then we subtract each gradient value from the corresponding weight value to reduce the error. In this way we move closer (descend) to the local minimum (minimum loss).

Ahirwar, K. (2017, November 1). Everything You Need To Know About Neural Networks. Retrieved from hackernoon.com
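Reduced to a single weight and a squared error, the chain-rule procedure above is just gradient descent. The learning rate, data point, and step count below are made up for illustration:

```python
# Smallest possible model: one weight w, prediction y = w * x,
# squared error E = (y - t)^2 against a target t.
# By the chain rule, dE/dw = 2 * (w*x - t) * x; subtracting the
# gradient (scaled by a learning rate) reduces the error each step.

def train(w, x, t, lr=0.1, steps=50):
    for _ in range(steps):
        grad = 2.0 * (w * x - t) * x   # dE/dw via the chain rule
        w -= lr * grad                  # descend toward lower error
    return w

w_final = train(w=0.0, x=2.0, t=4.0)   # ideal weight is t/x = 2.0
```
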
DEEP LEARNING 20

DEEP LEARNING
- an artificial intelligence (AI) function that mimics the workings of the human brain in processing data for use in decision making.

Hargrave, M. (2019, April 30). Deep Learning. Retrieved from investopedia.com


DEEP LEARNING 21

DEEP LEARNING
- a subset of machine learning.
- utilizes a hierarchical level of artificial neural networks to carry out the process of machine learning.
- the artificial neural networks are built like the human brain, with neuron nodes connected together like a web.
- while traditional programs build analysis with data in a linear way, the hierarchical function of deep learning systems enables machines to process data with a nonlinear approach.

Hargrave, M. (2019, April 30). Deep


Learning. Retrieved from
investopedia.com
DEEP LEARNING 22

DEEP LEARNING
- is able to learn from data that is both unstructured and unlabeled.

Hargrave, M. (2019, April 30). Deep Learning. Retrieved from investopedia.com


DEEP LEARNING 23

Deep learning is used across all industries for a number of different tasks.

Commercial apps that use:
1. Image recognition
2. Open-source platforms with consumer recommendation tools
3. Medical research apps that explore the possibility of reusing drugs for new ailments
are a few examples of deep learning incorporation.

Hargrave, M. (2019, April 30). Deep Learning. Retrieved from investopedia.com
