
Da Nang University of Technology

Course: CSE446

LABORATORY 1
Application of NN for Fish Behavior Understanding

Class: 13ECE
Students:
Nguyen Le Huan (13ECE2)
Nguyen Anh Nguyen (13ECE1)
Ngo Thanh Truong Giang (13ECE1)

Da Nang
September 24, 2017
1. Project Objective:
The objectives of this lab are the following:
- To implement supervised learning in the form of neural networks with error
backpropagation.
- To complete the lab entirely in Matlab, starting from scratch.
- To build and test the resulting network on the application of Fish
Behavior Understanding.

2. Project Scope:
To meet the project requirements, you will need to:
a. Get familiar with the Neural Network Toolbox included in Matlab.
b. Set up a neural network to learn a specific learning problem.
c. Get familiar with the confusion matrix as an evaluation method.
d. Analyze the performance of the network when changing the configuration
parameters.
3. Get Matrix

File = dir;
%File = File(~cellfun('isempty', {File.date}));
inputFile = [];
outputFile = [];
for i = 3 : 72
    fileName = File(i).name;
    load(fileName);                      % loads 'statistic' and 'dodoc'
    input = statistic';
    inputFile = [inputFile input];
    if (dodoc == 0)
        output = [-1 -1]';               % clean water
    else
        output = [1 1]';                 % polluted water
    end
    outputFile = [outputFile output];
end

First of all, we load the files for training. We use the data in the Train
folder, located in the same directory on the computer, and read each file with a loop.
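The label-encoding step above can also be sketched outside Matlab. Below is a minimal Python illustration, assuming (as in the lab's .mat files) that each file yields a 9-element `statistic` vector and a `dodoc` flag; the data here is synthetic, not the lab's:

```python
import numpy as np

def build_dataset(samples):
    """samples: list of (statistic_vector, dodoc_flag) pairs."""
    inputs, targets = [], []
    for statistic, dodoc in samples:
        inputs.append(np.asarray(statistic))   # 9 features per file
        # dodoc == 0 -> clean water [-1, -1]; otherwise polluted water [1, 1]
        targets.append(np.array([-1, -1]) if dodoc == 0 else np.array([1, 1]))
    # columns are samples, matching the MATLAB inputFile/outputFile layout
    return np.stack(inputs, axis=1), np.stack(targets, axis=1)

# two synthetic files: one clean (dodoc = 0), one polluted (dodoc = 1)
X, T = build_dataset([([0.1] * 9, 0), ([0.9] * 9, 1)])
```

As in the MATLAB loop, samples are stored column-wise, so `X` is 9 x N and `T` is 2 x N.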
4. Train
inputs = inputFile;
targets = outputFile;

hiddenLayerSize = [7 3];                 % two hidden layers: 7 and 3 neurons

net = fitnet(hiddenLayerSize, 'traincgb');
net.trainParam.epochs = 4000;
net.trainParam.goal = 1e-15;
net.trainParam.show = 10;
net.divideParam.trainRatio = 80/100;
net.divideParam.valRatio = 0/100;
net.divideParam.testRatio = 20/100;

[net, tr] = train(net, inputs, targets);
view(net)

After loading the training files, we choose the hidden layer sizes and prepare the network.
- Input: 9 properties.
- Output: 2 properties, corresponding to clean water and polluted water.

We use conjugate gradient backpropagation with Polak-Ribière updates ('traincgb').

Set up the division of data:
- 80% used for training.
- 20% used for testing.
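The 80/20 division that `divideParam` performs can be sketched as follows. This is a Python illustration with a hypothetical `split_train_test` helper, not the toolbox's actual routine:

```python
import numpy as np

def split_train_test(X, T, train_ratio=0.8, seed=0):
    """Randomly split column-wise samples into train and test sets."""
    n = X.shape[1]
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)             # shuffle sample indices
    n_train = int(round(train_ratio * n))
    tr, te = idx[:n_train], idx[n_train:]
    return (X[:, tr], T[:, tr]), (X[:, te], T[:, te])

# 10 synthetic samples with 9 features each -> 8 train, 2 test
(X_tr, T_tr), (X_te, T_te) = split_train_test(
    np.arange(90).reshape(9, 10), np.ones((2, 10)))
```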
error = 15;
while (error > 5)
    % retrain a fresh network
    net = fitnet(hiddenLayerSize, 'traincgb');
    net.trainParam.epochs = 4000;
    net.trainParam.goal = 1e-15;
    net.trainParam.show = 10;
    net.divideParam.trainRatio = 80/100;
    net.divideParam.valRatio = 0/100;
    net.divideParam.testRatio = 20/100;
    error = 0;
    [net, tr] = train(net, inputs, targets);

    % load the 20 test files and compare predictions with the true labels
    myFolder = dir;
    myFolder = myFolder(~cellfun('isempty', {myFolder.date}));
    outputCompare = [];
    outputTestFile = [];
    for i = 3 : 22
        file = myFolder(i).name;
        load(file);                      % loads 'statistic' and 'dodoc'
        inputTest = statistic';
        if (dodoc == 0)
            output = [-1 -1]';           % clean water
        else
            output = [1 1]';             % polluted water
        end
        outputCompare = [outputCompare output];

        outputTest = net(inputTest);
        outputTest = hardlims(outputTest);   % threshold outputs to -1/+1
        outputTestFile = [outputTestFile outputTest];
    end

    % count misclassified test samples
    for j = 1 : 20
        if (outputTestFile(1, j) ~= outputCompare(1, j))
            error = error + 1;
        end
    end
    error
end

Following the idea of training again whenever the number of errors is unsatisfactory, we use a
loop that repeats until the number of errors is less than or equal to 4.
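The retrain-until-acceptable idea can be sketched as below. This is a Python illustration; `train_once` is a hypothetical stand-in for the `fitnet`/`train` cycle, simulated here with a fixed sequence of error counts:

```python
def retrain_until(train_once, max_errors=4, max_rounds=100):
    """Keep training fresh networks until the test-error count is acceptable."""
    for round_no in range(1, max_rounds + 1):
        errors = train_once()            # one full train + test evaluation
        if errors <= max_errors:
            return round_no, errors
    raise RuntimeError("no acceptable network found")

# simulated error counts for four successive training rounds
attempts = iter([15, 9, 6, 3])
round_no, errors = retrain_until(lambda: next(attempts))
```

Because `fitnet` reinitializes the weights each time, every iteration of the loop is an independent training attempt.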
5. Result

After several rounds of training, we obtain a result with 4 errors.

- The accuracy (AC) is the proportion of the total number of predictions that were correct. AC = 75%.
- The recall, or true positive rate (TP), is the proportion of positive cases that were correctly identified. TP = 86.6%.
- The false positive rate (FP) is the proportion of negative cases that were incorrectly classified as positive. FP = 60%.
- The true negative rate (TN) is the proportion of negative cases that were classified correctly. TN = 40%.
- The false negative rate (FN) is the proportion of positive cases that were incorrectly classified as negative. FN = 13.4%.
- Finally, the precision (P) is the proportion of the predicted positive cases that were correct. P = 18.75%.
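These definitions can be checked with a small sketch. This is a Python illustration for a binary confusion matrix; the counts below are made up for demonstration, not the lab's actual results:

```python
def metrics(tn, fp, fn, tp):
    """Confusion-matrix rates for binary classification counts."""
    total = tn + fp + fn + tp
    return {
        "accuracy":  (tp + tn) / total,   # AC: correct / all predictions
        "recall":    tp / (tp + fn),      # TP rate
        "fp_rate":   fp / (fp + tn),      # FP rate
        "tn_rate":   tn / (fp + tn),      # TN rate
        "fn_rate":   fn / (tp + fn),      # FN rate
        "precision": tp / (tp + fp),      # P: correct / predicted positive
    }

# illustrative counts: 4 TN, 6 FP, 2 FN, 13 TP
m = metrics(tn=4, fp=6, fn=2, tp=13)
```

Note that FP + TN = 100% and FN + recall = 100% by construction, which matches the percentages reported above.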
At epoch 767 we get the best result. With each training run, the performance increases.
When we run the testing with 5 states and k = 100, we get:
Recall = 1.000000; Precision = 1.000000; Accuracy = 1.000000

[Figure: K-mean algorithm — Recall, Precision, and Accuracy (probability) versus Number of Observation]

Then we try the other cases and obtain the following results:

State   Number of observations   Recall   Precision   Accuracy
2       100                      1        0.923077    0.96296
3       100                      1        0.857143    0.925926
4       100                      1        0.8         0.888889
5       100                      1        1           1
6       100                      1        0.925926    0.96296
7       100                      1        0.75        0.851852
8       100                      1        0.75        0.851852
9       100                      1        1           1
10      100                      0        0           0.444444
[Figure: Recall, Precision, and Accuracy (probability) versus Number of states]

We found that the precision improves when we take k = 100. Although 9 states gives
the same results, it takes more time to run the algorithm. Therefore, the
experimental results showed that the 5-state left-to-right hidden Markov model
provided the highest performance.
