
ANN for fitting applications
(data modeling)

 Scope:
❖ Build a cost-effective model (system identification)
based on a set of data used to train a neural network

 Main phases:
1. ANN architecture selection and training
2. ANN simulation - using the developed model

❖ In fitting problems, one wants a neural network to map
between a set of numeric inputs and a set of numeric
outputs (targets).

❖ A neural network is trained to perform a particular function
by adjusting the values of the connections (weights and
biases) between its processing elements (PEs).

❖ Typically, neural networks are adjusted (trained) so that
a particular input leads to a specific target.
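The adjustment idea can be illustrated with a single supervised gradient step on one neuron; this is only a sketch of the principle (MATLAB's `train` uses a more sophisticated algorithm, Levenberg-Marquardt by default), with the starting values and learning rate chosen for illustration:

```python
# One linear neuron trained so a particular input leads to a specific target.
x, t = 0.5, 0.8        # input sample and its target (illustrative values)
w, b = 0.1, 0.0        # initial weight and bias (illustrative values)
lr = 0.5               # learning rate (illustrative value)

for _ in range(200):
    y = w * x + b      # neuron output
    e = t - y          # error = target - output
    w += lr * e * x    # adjust weight to reduce the error
    b += lr * e        # adjust bias likewise

print(round(w * x + b, 3))  # -> 0.8: the input now leads to the target
```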

Neural network training
 Supervised learning

Fitting (Modeling) a Simple Function
y = sin(2*x)./exp(x/5);

[Figure: plot of y = sin(2x)/exp(x/5) for x from 0 to 10; horizontal axis x, vertical axis y]
Problem formulation
x - inputs
y - targets (outputs)
101 data pairs

inputs: matrix of 1 x 101 samples
targets: matrix of 1 x 101 samples

inputs:  … 3.7    3.8    3.9    4.0    4.1    4.2 …
targets: … 0.4288 0.4527 0.4577 0.4445 0.4143 0.3689 …
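The 101 data pairs can be generated by sampling the target function on a uniform grid; a NumPy sketch of this step (the slides use MATLAB, and the 0.1 step size is inferred from the listed samples):

```python
import numpy as np

# Sample y = sin(2x)/exp(x/5) at x = 0, 0.1, ..., 10 -> 101 input/target pairs,
# stored as 1 x 101 row matrices as described above.
x = np.linspace(0.0, 10.0, 101).reshape(1, -1)   # inputs, 1 x 101
y = np.sin(2 * x) / np.exp(x / 5)                # targets, 1 x 101

print(x.shape, y.shape)      # (1, 101) (1, 101)
print(round(y[0, 40], 4))    # x = 4.0 -> 0.4445, matching the table
```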
ANN architecture

A two-layer feed-forward network with sigmoid hidden
neurons and linear output neurons can fit multidimensional
mapping problems arbitrarily well, given consistent data
and enough neurons in its hidden layer.
ANN training
>> nnstart
13 neurons in the hidden layer
Error evolution during the training process
[Figure: Mean Squared Error (mse) vs. epoch for the Train, Validation and Test sets on a log scale, over 407 epochs. Best validation performance is 5.9723e-08 at epoch 407.]
[Figure: Error Histogram with 20 Bins. Errors = Targets - Outputs; instance counts for the Training, Validation and Test sets, with errors spanning roughly -0.0008 to 0.0002 and a Zero Error reference line.]
[Figure: Function Fit for Output Element 1. Upper panel: Output and Target vs. Input, showing Training, Validation and Test targets and outputs together with the Fit. Lower panel: Error (Targets - Outputs), on the order of 10^-4.]
[Figure: Regression plots of Output vs. Target for the Training, Validation, Test and All sets, each with R=1. The fitted lines (Output ~= 1*Target + offset) have slope 1 and offsets of -6.4e-09, 8.3e-05, 4.9e-06 and 1.6e-05 respectively, plotted against the Y=T reference; annotations mark the linear regression offset and slope.]

The linear regression offers information at two extremes:

➢ it provides a global appreciation of the accuracy (through the regression value R and through the slope
and offset);
➢ it compares the position of each generated data point with its target counterpart.
❖ the global appreciation can “hide” some errors, while the detailed plot of each point can produce an
image that is very difficult to analyze.
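The R value, slope and offset reported in the regression plots can be computed directly; a NumPy sketch on illustrative data (small noise stands in for a trained network's residual error, not the actual outputs from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
targets = np.linspace(-0.6, 0.8, 101)                # illustrative targets
outputs = targets + 1e-4 * rng.standard_normal(101)  # near-perfect outputs

# Linear fit: outputs ~= slope * targets + offset, as in the regression plots.
slope, offset = np.polyfit(targets, outputs, 1)
R = np.corrcoef(targets, outputs)[0, 1]              # regression value R

print(round(slope, 3))   # slope close to 1, as on the slides
```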
>> help nnet
Model utilization in Simulink

[Figure: Simulink models built from the trained network, each with 5 neurons in the hidden layer; two variants are shown after retraining.]
Example

Prove that the total number of training parameters is given by the relation:

Np = H(N + 1) + K(H + 1)

where N is the number of inputs, H the number of hidden neurons and K the number of outputs.
The number of inputs and outputs is set by the problem to be solved; the user can choose
only the number of neurons in the hidden layer.
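For the single-input, single-output network above with 13 hidden neurons, the relation gives Np = 13·(1+1) + 1·(13+1) = 40. A quick check by counting weights and biases directly (a sketch, not part of the slides):

```python
# Count trainable parameters of a two-layer feed-forward network:
# hidden layer: H*N weights + H biases; output layer: K*H weights + K biases.
def num_params(N, H, K):
    hidden = H * N + H      # = H * (N + 1)
    output = K * H + K      # = K * (H + 1)
    return hidden + output

# Network from the sin(2x)/exp(x/5) example: N = 1 input, H = 13, K = 1 output.
print(num_params(1, 13, 1))   # 13*2 + 1*14 = 40
```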
Fitting (Modeling) a Multivariable Function
Design parameters      Range
(input variables)
IB  [μA]               25 – 65
W12 [μm]               0.6 – 3.5
W34 [μm]               0.85 – 6.5
W56 [μm]               5 – 90

Functions to be modeled (fitted):
• Avo
• GBW

Data structure: data set length 800
Data samples
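With four input variables and 800 samples, the data take the RxQ form the toolbox expects (R = 4 inputs, Q = 800 columns); a NumPy sketch with uniformly sampled illustrative values over the ranges in the table (the real data set comes from circuit simulations, not random sampling):

```python
import numpy as np

rng = np.random.default_rng(0)
Q = 800                                   # data set length

# One row per design parameter, sampled over its range from the table.
IB  = rng.uniform(25, 65, Q)              # [uA]
W12 = rng.uniform(0.6, 3.5, Q)            # [um]
W34 = rng.uniform(0.85, 6.5, Q)           # [um]
W56 = rng.uniform(5, 90, Q)               # [um]

inputs = np.vstack([IB, W12, W34, W56])   # 4 x 800 input matrix
print(inputs.shape)                       # (4, 800)
```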
>> nftool
function net = create_fit_net(inputs,targets)
%CREATE_FIT_NET Creates and trains a fitting neural network.
% NET = CREATE_FIT_NET(INPUTS,TARGETS) takes these arguments:
% INPUTS - RxQ matrix of Q R-element input samples
% TARGETS - SxQ matrix of Q S-element associated target samples
% arranged as columns, and returns these results:
% NET - The trained neural network
% For example, to solve the Simple Fit dataset problem with this function:
% load simplefit_dataset
% net = create_fit_net(simplefitInputs,simplefitTargets);
% simplefitOutputs = sim(net,simplefitInputs);
% To reproduce the results you obtained in NFTOOL:
% net = create_fit_net(inputs,avo_target);

% Create Network
numHiddenNeurons = 20; % Adjust as desired
net = newfit(inputs,targets,numHiddenNeurons);
net.divideParam.trainRatio = 70/100; % Adjust as desired
net.divideParam.valRatio = 15/100; % Adjust as desired
net.divideParam.testRatio = 15/100; % Adjust as desired

% Train and Apply Network
[net,tr] = train(net,inputs,targets);
outputs = sim(net,inputs);

% Plot
plotperf(tr)
plotfit(net,inputs,targets)
plotregression(targets,outputs)
Simulink model

[Figure: Simulink block diagram of the generated neural network Avo_sota, showing Layer 1 with its weight blocks.]
Application: ANN based model for
Avo(IB,W12,W34,W56) for SOTA
Design parameters      Range
(input variables)
IB  [μA]               25 – 65
W12 [μm]               0.6 – 3.5
W34 [μm]               0.85 – 6.5
W56 [μm]               5 – 90

Data set:

input data:

output data:
For ANN generation use GenTrainANN.m

For ANN simulation (as a model) use SimANN.m
