February 1, 2019
Outline
1 Analysis of Frameworks
Motivation
Comparison of Frameworks
Analysis Based on time
2 Implementation Task
Implementation of a BPSK Detector
Goal
Problem
A large number of deep learning frameworks are available
Each stands out with its own purpose
Which one suits my application?
Frameworks
Overview:
TensorFlow
Creator: Google
Platform: Windows, Linux, macOS, Android
Interface Language: Python, C, C++, JavaScript, R, Julia
Keras
Creator: François Chollet
Platform: Windows, Linux, macOS
Interface Language: Python, R
MATLAB
Creator: Mathworks
Platform: Windows, Linux, macOS
Interface Language: MATLAB
Caffe
Creator: Berkeley Vision and Learning Center
Platform: Windows, Linux, macOS
Interface Language: Python, MATLAB, C++
PyTorch
Creator: Facebook AI Group
Platform: Windows, Linux, macOS
Interface Language: Python
Available Layers
Fully connected network:
Layers: Input, Hidden, Output
Each neuron is connected to all the neurons in the next layer
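The structure above can be sketched as a forward pass in NumPy; layer sizes here are illustrative assumptions, not values from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, W, b):
    # Fully connected: every input neuron feeds every output neuron, y = xW + b
    return x @ W + b

def relu(x):
    return np.maximum(0.0, x)

# Illustrative sizes: 4 inputs -> 8 hidden -> 3 outputs
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 3)), np.zeros(3)

x = rng.normal(size=(1, 4))   # one input sample
h = relu(dense(x, W1, b1))    # hidden layer
y = dense(h, W2, b2)          # output layer
```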
Convolutional Neural Network:
A convolution operation is applied to the input data
A sliding kernel is applied
Can be followed by a pooling operation
Convolution 2D:
Redefining Convolution:
Figure: Kernel
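The sliding-kernel operation can be sketched directly in NumPy (as in deep-learning frameworks, this is cross-correlation: the kernel is not flipped; the image and kernel values are illustrative):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution: slide the kernel over the image and
    sum the element-wise products at each position (no kernel flip)."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling, as may follow a convolution."""
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h*size, :w*size].reshape(h, size, w, size).max(axis=(1, 3))

image = np.arange(16, dtype=float).reshape(4, 4)
kernel = np.array([[1.0, 0.0], [0.0, -1.0]])  # illustrative 2x2 kernel
feat = conv2d(image, kernel)                  # shape (3, 3)
pooled = max_pool(feat, 2)                    # shape (1, 1)
```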
Deconvolution 2D:
Shortcomings:
Only short-term dependencies are captured
Vanishing gradient problem
LSTM
It interacts with subsequent layers through four different entities, and the key cell state preserves long-term dependencies.
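The four entities are the forget, input, and output gates plus the candidate cell update; a single-time-step sketch in NumPy (sizes are illustrative assumptions):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step. W, U, b stack the parameters of the four
    entities: forget gate f, input gate i, candidate g, output gate o."""
    n = h.shape[0]
    z = W @ x + U @ h + b
    f = sigmoid(z[0:n])          # forget gate
    i = sigmoid(z[n:2*n])        # input gate
    g = np.tanh(z[2*n:3*n])      # candidate cell update
    o = sigmoid(z[3*n:4*n])      # output gate
    c_new = f * c + i * g        # cell state carries long-term memory
    h_new = o * np.tanh(c_new)   # hidden state passed onward
    return h_new, c_new

rng = np.random.default_rng(0)
nx, nh = 3, 5                    # illustrative input/hidden sizes
W = rng.normal(size=(4*nh, nx))
U = rng.normal(size=(4*nh, nh))
b = np.zeros(4*nh)

h, c = np.zeros(nh), np.zeros(nh)
h, c = lstm_step(rng.normal(size=nx), h, c, W, U, b)
```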
Activation Function
Provides a non-linear mapping between the input and the response variable.
Assures universal function approximation.
Must be a differentiable mathematical function.
Sigmoid:
+ Non-linear and differentiable
+ Probabilistic output
- Saturates for x ≫ 0 and x ≪ 0
- Not zero-centered
Abdur Rahman, Mohamed Ismail – Analysis of Different Frameworks for Deep Neural Network and its Implementation
Tanh:
+ Non-linear and differentiable
+ Zero-centered
- Vanishing gradient problem
ReLU:
+ Avoids the vanishing gradient problem
- Not zero-centered
Leaky ReLU:
+ Fixes the dying-ReLU problem
Parametric ReLU: learnable parameter a
Exponential ReLU (ELU):
+ No vanishing gradient problem
+ Can keep the mean activation near zero
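The properties listed above can be checked numerically; a NumPy sketch of the four activations (default slope and scale values are the common conventions, assumed here):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, a=0.01):
    # Small slope a for x < 0 keeps units from "dying"
    return np.where(x > 0, x, a * x)

def elu(x, a=1.0):
    # Bounded below by -a, pulling the mean activation toward zero
    return np.where(x > 0, x, a * (np.exp(x) - 1))

# Sigmoid saturates for large |x|; tanh is zero-centered (odd function)
print(sigmoid(10.0), np.tanh(2.0) + np.tanh(-2.0))
```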
Need:
Though most frameworks come with a number of sophisticated predefined layers, there are instances when defining application-specific layers, either from scratch or as a composition of existing layers, is necessary.
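In the TensorFlow/Keras case, a custom layer can be defined by subclassing `tf.keras.layers.Layer`; a minimal sketch (the dense-plus-learnable-scale operation is only an illustrative example, not a layer from the talk):

```python
import tensorflow as tf

class ScaledDense(tf.keras.layers.Layer):
    """Illustrative custom layer: a dense transform composed with a
    learnable output scale, built from scratch via build()/call()."""
    def __init__(self, units):
        super().__init__()
        self.units = units

    def build(self, input_shape):
        # Weights are created lazily, once the input shape is known
        self.w = self.add_weight(shape=(input_shape[-1], self.units),
                                 initializer="glorot_uniform")
        self.b = self.add_weight(shape=(self.units,), initializer="zeros")
        self.scale = self.add_weight(shape=(), initializer="ones")

    def call(self, inputs):
        return self.scale * (tf.matmul(inputs, self.w) + self.b)

layer = ScaledDense(4)
out = layer(tf.ones((2, 3)))   # builds the layer and runs the forward pass
```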
GUI Availability
MATLAB – GUI
Implications of the comparisons
Model setup:
A fully connected neural network is considered
The MNIST dataset with 60,000 training images and 10,000 test images is taken as input
Varying the network “depth” (the number of internal layers) with a fixed number of neurons in each layer
Varying the network “width” (the number of neurons per layer) with a fixed number of layers
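The two sweeps change model capacity in different ways, which a parameter count makes concrete; a pure-Python sketch for MNIST-sized inputs (784 features, 10 classes; the widths and depths below are illustrative, not the talk's grid):

```python
def mlp_params(width, depth, n_in=784, n_out=10):
    """Number of weights + biases in a fully connected network
    with `depth` internal layers of `width` neurons each."""
    sizes = [n_in] + [width] * depth + [n_out]
    return sum(a * b + b for a, b in zip(sizes, sizes[1:]))

# Varying depth at fixed width: parameters grow linearly in depth
for d in (1, 2, 3):
    print("depth", d, "->", mlp_params(width=128, depth=d))

# Varying width at fixed depth: parameters grow roughly quadratically
for w in (64, 128, 256):
    print("width", w, "->", mlp_params(width=w, depth=2))
```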
Goal
To implement a BPSK detector considering an AWGN channel
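As a baseline before any learning: BPSK maps bits to ±1, the AWGN channel adds Gaussian noise, and the optimal detector thresholds the received value at zero. A NumPy sketch (the SNR value is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(0)

n_bits = 100_000
snr_db = 6.0                                   # illustrative Eb/N0 in dB
bits = rng.integers(0, 2, n_bits)
symbols = 2.0 * bits - 1.0                     # BPSK mapping: 0 -> -1, 1 -> +1

# AWGN with noise variance N0/2 per real dimension (Eb = 1)
n0 = 10.0 ** (-snr_db / 10.0)
received = symbols + rng.normal(scale=np.sqrt(n0 / 2.0), size=n_bits)

detected = (received > 0.0).astype(int)        # zero-threshold detector
ber = np.mean(detected != bits)
print(f"BER at {snr_db} dB: {ber:.5f}")
```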
Framework
Framework used: TensorFlow with Keras
Motivation
Computation time
Available libraries
Detailed documentation
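With TensorFlow and Keras, such a detector can be posed as binary classification of the noisy received value; a minimal model sketch (the architecture and noise level here are illustrative assumptions, not the configuration from the talk):

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)

# Training data: noisy BPSK samples -> transmitted bit
bits = rng.integers(0, 2, 10_000)
rx = (2.0 * bits - 1.0) + rng.normal(scale=0.5, size=bits.shape)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu"),     # illustrative width
    tf.keras.layers.Dense(1, activation="sigmoid"),  # outputs P(bit = 1)
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(rx[:, None], bits, epochs=3, batch_size=64, verbose=0)

p = model.predict(np.array([[-1.0], [1.0]]), verbose=0)
```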
Figure: Training