Analysis of Influences of Memory on Cognitive Load Using the Neural Network Back Propagation Algorithm
A. Naveen 1, M. S. Josephine 2
1 Research Scholar, Department of Computer Applications, Dr. M.G.R. Educational and Research Institute University, Chennai
2 Professor, Department of Computer Applications, Dr. M.G.R. Educational and Research Institute University, Chennai
E-mail: josejbr@yahoo.com, naveenking@yahoo.co.in
ABSTRACT
Educational data mining is used to evaluate the learner's performance and the learning environment. The learning process involves, and is influenced by, different components, among which memory plays a vital role. Long-term, short-term, working, instant, responsive, process, recollect, reference, instruction and action memory are all involved in the process of learning. The factors influencing these memories are identified through the construction and analysis of a neural network back propagation model. The observed data are represented in a cubical dataset format for the mining approach. The mining process is carried out using a neural-network-based back propagation model to determine the cognitive load that influences the different learning challenges. The learners' difficulties are identified through the experimental results.
I. INTRODUCTION

II. REVIEW OF LITERATURE

S. No. | Year | Author | Work
1 | 2000 | Ma, Y., Liu, B., Wong, C. K., Yu, P. S., Lee, S. M. | Presented a real-life application of data mining to find weak students
2 | 2001 | Luan, J. | Introduced a powerful decision support tool, data mining, in the context of knowledge management
3 | 2002 | Luan, J. | Discussed the potential applications of data mining in higher education and explained how data mining saves resources while maximizing efficiency in academics
4 | 2005 | Delavari et al. | Proposed a model for the application of data mining in higher education
5 | 2006 | Shyamala, K. & Rajagopalan, S. P. | Developed a model to find similar patterns from the gathered data and to make predictions about students' performance
6 | 2006 | Sargenti et al. | Explored the development of a model which allows for diffusion of knowledge within a small business university

IV. NEURAL NETWORK BACK PROPAGATION MODEL

NEURAL NETWORK
Artificial neural networks simulate, in a simplified way, the biological structure of the neural networks in brains, endowing computers with the considerable abilities of biological neural networks: learning from examples, pattern classification, generalization and prediction.
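As a concrete illustration of the simulated neuron described above, the following minimal sketch (in Python, which the paper does not specify) computes the output of a single artificial neuron using the sigmoidal activation adopted later in the algorithm; the input and weight values are illustrative only, not taken from the paper's dataset.

```python
import math

def sigmoid(x):
    # Logistic (sigmoidal) activation, as used by the back propagation model
    return 1.0 / (1.0 + math.exp(-x))

def neuron_output(inputs, weights):
    # Weighted sum of the inputs followed by the sigmoidal squashing function
    activation = sum(i * w for i, w in zip(inputs, weights))
    return sigmoid(activation)

# Illustrative values only: three inputs and their synaptic weights
out = neuron_output([0.5, 0.8, 0.1], [0.2, -0.4, 0.7])
```

Because the sigmoid maps any weighted sum into (0, 1), the neuron's output can be interpreted on the same normalized scale as the inputs.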
BACK PROPAGATION ALGORITHM
As per the adopted back propagation neural network model, the observed data are mapped and the model is constructed. The adopted algorithm is converted into a step-by-step executable procedure and presented below.
Step 1: Normalize the inputs and outputs with respect to their maximum values. For each training pair, assume that in normalized form there are l inputs given by { I }I (l x 1) and n outputs given by { O }O (n x 1).

Step 2: Assume that the number of neurons m in the hidden layer lies in the range 1 < m < 10, because ten memory attributes are considered for this network construction.

Step 3: Let [ V ] represent the weights of the synapses connecting the input neurons and the hidden neurons, and let [ W ] represent the weights of the synapses connecting the hidden neurons and the output neurons. Initialize the weights to small random values, usually from -1 to +1:
[ V ]0 = [ random weights ]
[ W ]0 = [ random weights ]
[ ΔV ]0 = [ ΔW ]0 = [ 0 ]

Step 4: Present one set of inputs and outputs from the training data. The output of the input layer is the input itself:
{ O }I = { I }I (l x 1)

Step 5: Compute the inputs to the hidden layer by multiplying the corresponding weights of the synapses:
{ I }H = [ V ]T { O }I (m x 1)

Step 6: Let the hidden layer units evaluate the output using the sigmoidal function:
O Hi = 1 / ( 1 + e^( -I Hi ) ) (m x 1)

Step 7: Compute the inputs to the output layer by multiplying the corresponding weights of the synapses:
{ I }O = [ W ]T { O }H (n x 1)

Step 8: Let the output layer units evaluate the output using the sigmoidal function:
O Oj = 1 / ( 1 + e^( -I Oj ) ) (n x 1)
Note: this output is the network output.

Step 9: Calculate the error between the network output and the desired (target) output for the presented training pair:
E p = sqrt( Σ ( T j - O Oj )^2 ) / n

Step 10: Find { d } as
d k = ( T k - O Ok ) O Ok ( 1 - O Ok ) (n x 1)

Step 11: Find the [ Y ] matrix as
[ Y ] = { O }H ⟨ d ⟩ (m x n)

Step 12: Find
[ ΔW ]t+1 = α [ ΔW ]t + η [ Y ] (m x n)
where η is the learning rate and α the momentum coefficient.

Step 13: Find
{ e } = [ W ] { d } (m x 1)
d*i = e i ( O Hi ) ( 1 - O Hi ) (m x 1)
Find the [ X ] matrix as
[ X ] = { O }I ⟨ d* ⟩ = { I }I ⟨ d* ⟩ (l x m)

Step 14: Find
[ ΔV ]t+1 = α [ ΔV ]t + η [ X ] (l x m)

Step 15: Find
[ V ]t+1 = [ V ]t + [ ΔV ]t+1
[ W ]t+1 = [ W ]t + [ ΔW ]t+1

Step 16: Find the error rate as
error rate = Σ E p / nset
Repeat Steps 4 to 16 until the error rate converges to a tolerable value.
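The sixteen steps above can be sketched as a single training routine. The following sketch (pure standard-library Python; the paper does not specify an implementation language) follows the same structure: [V] and [W] are the input-to-hidden and hidden-to-output weight matrices, eta is the learning rate and alpha the momentum coefficient. The hyperparameter values and the XOR training data are illustrative assumptions, not the paper's memory-attribute dataset.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train(inputs, targets, m, eta=0.5, alpha=0.3, epochs=2000, seed=1):
    """One-hidden-layer back propagation, mirroring Steps 1-16.

    inputs : list of input vectors {I}I (assumed already normalized, Step 1)
    targets: list of target vectors {O}O
    m      : number of hidden neurons (Step 2)
    """
    rng = random.Random(seed)
    l, n = len(inputs[0]), len(targets[0])
    # Step 3: small random weights in [-1, +1]; delta buffers start at zero
    V = [[rng.uniform(-1, 1) for _ in range(m)] for _ in range(l)]
    W = [[rng.uniform(-1, 1) for _ in range(n)] for _ in range(m)]
    dV = [[0.0] * m for _ in range(l)]
    dW = [[0.0] * n for _ in range(m)]
    err = 0.0
    for _ in range(epochs):
        err = 0.0
        for O_I, T in zip(inputs, targets):        # Step 4: present one pair
            # Steps 5-6: hidden layer input and sigmoidal output
            O_H = [sigmoid(sum(O_I[i] * V[i][j] for i in range(l)))
                   for j in range(m)]
            # Steps 7-8: output layer input and sigmoidal output (network output)
            O_O = [sigmoid(sum(O_H[j] * W[j][k] for j in range(m)))
                   for k in range(n)]
            # Step 9: error for this training pair
            err += math.sqrt(sum((T[k] - O_O[k]) ** 2 for k in range(n))) / n
            # Step 10: output-layer term {d}
            d = [(T[k] - O_O[k]) * O_O[k] * (1 - O_O[k]) for k in range(n)]
            # Step 13 (first part): back-propagated hidden term {d*}
            e = [sum(W[j][k] * d[k] for k in range(n)) for j in range(m)]
            d_star = [e[j] * O_H[j] * (1 - O_H[j]) for j in range(m)]
            # Steps 11-12 and 15: update [W] with momentum
            for j in range(m):
                for k in range(n):
                    dW[j][k] = alpha * dW[j][k] + eta * O_H[j] * d[k]
                    W[j][k] += dW[j][k]
            # Steps 13-15: update [V] with momentum
            for i in range(l):
                for j in range(m):
                    dV[i][j] = alpha * dV[i][j] + eta * O_I[i] * d_star[j]
                    V[i][j] += dV[i][j]
    return V, W, err / len(inputs)                 # Step 16: mean error rate

# Illustrative data only (XOR), not the paper's memory-attribute observations
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
T = [[0], [1], [1], [0]]
V, W, final_err = train(X, T, m=4)
```

The online (pattern-by-pattern) weight updates and the momentum term match the stepwise form above; a batch variant would accumulate [Y] and [X] over all pairs before updating.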
[Numeric table: column structure lost in extraction.]
VI. RESULT ANALYSIS
[Numeric result values: table structure lost in extraction.]