Convolutional Neural Network
Classification by learning features (LeNet, 1998)
Focus on end-to-end learning!
The Operations in Detail
Activation function
Spatial / dimensional structures are preserved.
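The claim above can be sketched in NumPy: with zero padding of ⌊k/2⌋ on each side, a k×k filter produces an output with the same height and width as the input, so spatial structure is preserved (a minimal illustrative sketch, not code from the slides):

```python
import numpy as np

def conv2d_same(image, kernel):
    """Correlate a 2D image with a 2D kernel, zero-padding so the
    spatial size of the output matches the input ("same" mode)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)))
    H, W = image.shape
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

img = np.arange(25, dtype=float).reshape(5, 5)
k = np.ones((3, 3)) / 9.0        # simple averaging filter
out = conv2d_same(img, k)
print(out.shape)                  # (5, 5): spatial structure preserved
```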
If we use 6 such filters, the ConvNet's neurons are arranged in 3D grids.
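A hedged NumPy sketch of this arrangement: sliding each of 6 illustrative 3×3×3 filters over a 32×32×3 input produces a 30×30×6 output volume, i.e. neurons laid out in a 3D grid (all shapes here are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.standard_normal((32, 32, 3))     # H x W x depth input
filters = rng.standard_normal((6, 3, 3, 3))  # 6 filters of size 3x3x3

def conv_volume(img, banks):
    """Valid correlation of each 3D filter over the input volume;
    stacking the 6 response maps gives a 3D grid of neurons."""
    H, W, _ = img.shape
    n, kh, kw, _ = banks.shape
    oh, ow = H - kh + 1, W - kw + 1
    out = np.zeros((oh, ow, n))
    for f in range(n):
        for i in range(oh):
            for j in range(ow):
                out[i, j, f] = np.sum(img[i:i + kh, j:j + kw, :] * banks[f])
    return out

vol = conv_volume(image, filters)
print(vol.shape)   # (30, 30, 6): a 3D grid of activations
```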
ReLU – Rectified Linear Unit (the activation function)
Pool – A sub-sampling-like operation
Strides and Zero Padding
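The standard output-size formula, (W − F + 2P)/S + 1 for input width W, filter size F, zero padding P, and stride S, can be checked with a small helper (the function name is mine):

```python
def conv_output_size(W, F, P, S):
    """Output width of a convolution: (W - F + 2P) / S + 1.
    Raises if the stride does not tile the input evenly."""
    num = W - F + 2 * P
    if num % S != 0:
        raise ValueError("stride does not evenly divide the input")
    return num // S + 1

print(conv_output_size(32, 5, 0, 1))  # 28
print(conv_output_size(32, 3, 1, 1))  # 32 (zero padding preserves size)
print(conv_output_size(7, 3, 1, 2))   # 4
```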
Pooling:
• Makes representations “manageable”
• Introduces 0 parameters
• No zero padding
Max-pooling:
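A minimal max-pooling sketch (the array values are illustrative): each 2×2 window keeps only its largest activation, adds no parameters, and uses no zero padding.

```python
import numpy as np

def max_pool(x, size=2, stride=2):
    """2x2 max-pooling: keep the largest activation in each window.
    Introduces 0 parameters; no zero padding is used."""
    H, W = x.shape
    oh, ow = (H - size) // stride + 1, (W - size) // stride + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = x[i * stride:i * stride + size,
                          j * stride:j * stride + size].max()
    return out

x = np.array([[1., 3., 2., 1.],
              [4., 2., 0., 5.],
              [6., 1., 7., 2.],
              [0., 8., 3., 4.]])
print(max_pool(x))   # [[4. 5.] [8. 7.]]
```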
Training the CNN: More practical approaches
Relook & more: Activation functions
Logistic
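A quick sketch of the logistic function, shown next to ReLU for comparison (the sample values are illustrative):

```python
import numpy as np

def logistic(z):
    """Sigmoid: squashes to (0, 1); saturates for large |z|."""
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    """Rectified linear unit: max(0, z); does not saturate for z > 0."""
    return np.maximum(0.0, z)

z = np.array([-2.0, 0.0, 2.0])
print(logistic(z))   # approximately [0.119, 0.5, 0.881]
print(relu(z))       # [0. 0. 2.]
```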
PReLU training
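One way to see how the PReLU slope can be trained: the output is piecewise linear in both the input and the slope a, so the gradient with respect to a exists almost everywhere and a can be updated by gradient descent like any other parameter (a hedged sketch; the function names are mine):

```python
import numpy as np

def prelu(z, a):
    """PReLU: z for z > 0, a*z otherwise; the slope a is learned."""
    return np.where(z > 0, z, a * z)

def prelu_grad_a(z):
    """Gradient of the output w.r.t. the slope a: 0 for z > 0, z otherwise."""
    return np.where(z > 0, 0.0, z)

z = np.array([-2.0, -0.5, 1.0])
a = 0.25
print(prelu(z, a))        # [-0.5 -0.125 1.0]
print(prelu_grad_a(z))    # [-2. -0.5 0.]
```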
Maxout
Takes the max over several linear functions of the input.
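A minimal maxout sketch: the unit returns the max of k affine functions of the input (the weights here are illustrative):

```python
import numpy as np

def maxout(x, W, b):
    """Maxout unit: max over k affine pieces W @ x + b.
    W has shape (k, d), b has shape (k,)."""
    return np.max(W @ x + b)

x = np.array([1.0, -1.0])
W = np.array([[1.0, 0.0],     # piece 1: x0
              [0.0, 1.0],     # piece 2: x1
              [-1.0, -1.0]])  # piece 3: -(x0 + x1)
b = np.zeros(3)
print(maxout(x, W, b))  # max(1, -1, 0) = 1.0
```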
Advice from CNN gurus
Reminder: Preprocessing
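A common preprocessing reminder is to zero-center each feature and scale it to unit variance; a small NumPy sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=2.0, size=(100, 3))  # toy data, rows = samples

# Zero-center each feature, then scale to unit variance.
mean = X.mean(axis=0)
std = X.std(axis=0)
Xn = (X - mean) / std

print(np.allclose(Xn.mean(axis=0), 0.0))  # True
print(np.allclose(Xn.std(axis=0), 1.0))   # True
```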
More on: Weights initialization
Optimal initialization? Scale by the number of inputs.
HOW: Allow these parameters to be learned.
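One popular input-count-based scheme is He initialization, which sets the weight variance to 2/fan_in for ReLU layers; a sketch (that the slides mean exactly this scheme is an assumption):

```python
import numpy as np

def he_init(fan_in, fan_out, rng):
    """He initialization for ReLU layers: variance 2 / fan_in,
    i.e. the scale depends on the number of inputs to the neuron."""
    return rng.standard_normal((fan_in, fan_out)) * np.sqrt(2.0 / fan_in)

rng = np.random.default_rng(0)
W = he_init(512, 256, rng)
print(W.std())   # approximately sqrt(2/512) = 0.0625
```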
More on: Regularization
From the optimization literature: add a regularizing term to the cost, J(w) = L(w) + λ·Ω(w), where the parameter λ controls the amount of regularization; the derivative of Ω with respect to the weights should exist.
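For the common L2 case, the regularizing term is λ/2·‖w‖², whose derivative λ·w exists everywhere, as required above; a minimal sketch (the function name is mine):

```python
import numpy as np

def l2_cost_and_grad(w, data_loss, data_grad, lam):
    """Add lam/2 * ||w||^2 to the cost; its derivative lam * w
    exists everywhere, so gradient descent applies directly."""
    cost = data_loss + 0.5 * lam * np.sum(w * w)
    grad = data_grad + lam * w
    return cost, grad

w = np.array([1.0, -2.0])
cost, grad = l2_cost_and_grad(w, data_loss=1.0, data_grad=np.zeros(2), lam=0.1)
print(cost)   # 1.0 + 0.05 * 5 = 1.25
print(grad)   # [0.1, -0.2]
```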
Tikhonov regularization
LS [linear] case: smooths the topology, removing local minima.
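For the linear least-squares case, Tikhonov regularization has the closed form w = (XᵀX + λI)⁻¹Xᵀy; the added λI makes the problem strictly convex, leaving a single minimum (a synthetic-data sketch):

```python
import numpy as np

# Ridge / Tikhonov-regularized least squares: w = (X^T X + lam*I)^{-1} X^T y.
# The lam*I term makes the normal equations strictly convex.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
true_w = np.array([1.0, -2.0, 0.5])   # illustrative ground truth
y = X @ true_w

lam = 1e-3
w = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)
print(w)   # close to [1.0, -2.0, 0.5]
```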
For CNN: