
Last updated: 14 April 2010

Introduction to Artificial Neural Network

- theory, application and practice using WEKA -


Anto Satriyo Nugroho, Dr.Eng

Center for Information & Communication Technology,

Agency for the Assessment & Application of Technology (PTIK-BPPT)

Email: asnugroho@gmail.com

URL: http://asnugroho.net

Example of BP calculation

Sepal Length | Sepal Width | Petal Length | Petal Width | Class
-------------|-------------|--------------|-------------|-----------------
5.1          | 3.5         | 1.4          | 0.2         | Iris-setosa
7.0          | 3.2         | 4.7          | 1.4         | Iris-versicolor

2-class problem, 4 attributes

ANN Architecture

Inputs (4 attributes): Sepal-length, Sepal-width, Petal-length, Petal-width
Outputs (2 classes): Iris-setosa, Iris-versicolor
Learning rate: η = 0.1
Suppose the number of hidden neurons is 3.

Initialization

Initial weights w_{j,i} between Input Layer (i) and Hidden Layer (j), with biases θ_j:

  θ_1 = 0.1   w_{1,1} = 0.1   w_{1,2} = 0.4   w_{1,3} = 0.7   w_{1,4} = 1.0
  θ_2 = 0.2   w_{2,1} = 0.2   w_{2,2} = 0.5   w_{2,3} = 0.8   w_{2,4} = 1.1
  θ_3 = 0.3   w_{3,1} = 0.3   w_{3,2} = 0.6   w_{3,3} = 0.9   w_{3,4} = 1.2

Initial weights w_{k,j} between Hidden Layer (j) and Output Layer (k), with biases θ_k:

  θ_1 = 0.1   w_{1,1} = 0.1   w_{1,2} = 0.3   w_{1,3} = 0.5   (output 1: Iris-setosa)
  θ_2 = 0.2   w_{2,1} = 0.2   w_{2,2} = 0.4   w_{2,3} = 0.6   (output 2: Iris-versicolor)
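A minimal sketch of this setup in Python (the variable names are mine, not from the slides; the numbers are the initial values listed above):

```python
# 4-3-2 network used in this worked example.
# w_hidden[j][i] connects input i to hidden neuron j,
# w_output[k][j] connects hidden neuron j to output neuron k.
theta_hidden = [0.1, 0.2, 0.3]            # biases of the 3 hidden neurons
w_hidden = [[0.1, 0.4, 0.7, 1.0],         # weights into hidden neuron 1
            [0.2, 0.5, 0.8, 1.1],         # weights into hidden neuron 2
            [0.3, 0.6, 0.9, 1.2]]         # weights into hidden neuron 3

theta_output = [0.1, 0.2]                 # biases of the 2 output neurons
w_output = [[0.1, 0.3, 0.5],              # weights into output 1 (Iris-setosa)
            [0.2, 0.4, 0.6]]              # weights into output 2 (Iris-versicolor)

eta = 0.1                                 # learning rate
x = [5.1, 3.5, 1.4, 0.2]                  # training pattern (Iris-setosa)
t = [1, 0]                                # targets: Iris-setosa = 1, Iris-versicolor = 0
```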

Forward Pass

Calculating the output of Input Layer

Input pattern: (5.1, 3.5, 1.4, 0.2), class: Iris-setosa

$I_1 = x_1 = 5.1$
$I_2 = x_2 = 3.5$
$I_3 = x_3 = 1.4$
$I_4 = x_4 = 0.2$

Forward Pass

Calculating the output of Hidden Layer

$net_1 = \theta_1 + \sum_i w_{1,i} I_i = 0.1 + (0.1 \times 5.1) + (0.4 \times 3.5) + (0.7 \times 1.4) + (1.0 \times 0.2) = 3.19$

$H_1 = f(net_1) = \frac{1}{1 + e^{-3.19}} = 0.96$

Forward Pass

$net_2 = \theta_2 + \sum_i w_{2,i} I_i = 0.2 + (0.2 \times 5.1) + (0.5 \times 3.5) + (0.8 \times 1.4) + (1.1 \times 0.2) = 4.31$

$H_2 = f(net_2) = \frac{1}{1 + e^{-4.31}} = 0.99$

Forward Pass

$net_3 = \theta_3 + \sum_i w_{3,i} I_i = 0.3 + (0.3 \times 5.1) + (0.6 \times 3.5) + (0.9 \times 1.4) + (1.2 \times 0.2) = 5.43$

$H_3 = f(net_3) = \frac{1}{1 + e^{-5.43}} = 1.00$

Forward Pass

Calculating the output of Hidden Layer

Input pattern: (5.1, 3.5, 1.4, 0.2)
Hidden layer outputs: H_1 = 0.96, H_2 = 0.99, H_3 = 1.0
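As a check, the hidden-layer pass can be reproduced with a short Python sketch (variable names are mine; the weights are the initial values listed earlier):

```python
import math

def sigmoid(net):
    # logistic activation f(net) = 1 / (1 + e^(-net)) used throughout the example
    return 1.0 / (1.0 + math.exp(-net))

theta_hidden = [0.1, 0.2, 0.3]
w_hidden = [[0.1, 0.4, 0.7, 1.0], [0.2, 0.5, 0.8, 1.1], [0.3, 0.6, 0.9, 1.2]]
x = [5.1, 3.5, 1.4, 0.2]

H = []
for j in range(3):
    # net_j = theta_j + sum_i w_{j,i} * I_i, then H_j = f(net_j)
    net_j = theta_hidden[j] + sum(w_hidden[j][i] * x[i] for i in range(4))
    H_j = sigmoid(net_j)
    H.append(H_j)
    print(f"net_{j+1} = {net_j:.2f}, H_{j+1} = {H_j:.2f}")
# prints net_1 = 3.19, H_1 = 0.96 / net_2 = 4.31, H_2 = 0.99 / net_3 = 5.43, H_3 = 1.00
```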

Recall the initial weights w_{k,j} between Hidden Layer (j) and Output Layer (k):

  θ_1 = 0.1   w_{1,1} = 0.1   w_{1,2} = 0.3   w_{1,3} = 0.5
  θ_2 = 0.2   w_{2,1} = 0.2   w_{2,2} = 0.4   w_{2,3} = 0.6

Forward Pass

Calculating the output of Output Layer

$net_1 = \theta_1 + \sum_j w_{1,j} H_j = 0.1 + (0.1 \times 0.96) + (0.3 \times 0.99) + (0.5 \times 1.0) = 0.99$

$O_1 = f(net_1) = \frac{1}{1 + e^{-0.99}} = 0.73$

$Err_1 = t_1 - O_1 = 1 - 0.73 = 0.27$

Forward Pass

Calculating the output of Output Layer

$net_2 = \theta_2 + \sum_j w_{2,j} H_j = 0.2 + (0.2 \times 0.96) + (0.4 \times 0.99) + (0.6 \times 1.0) = 1.39$

$O_2 = f(net_2) = \frac{1}{1 + e^{-1.39}} = 0.80$

$Err_2 = t_2 - O_2 = 0 - 0.80 = -0.80$

Forward Pass

Calculating the output of Output Layer

Summary of the forward pass:

  Input layer:  I_1 = 5.1, I_2 = 3.5, I_3 = 1.4, I_4 = 0.2
  Hidden layer: H_1 = 0.96, H_2 = 0.99, H_3 = 1.0
  Output layer: O_1 = 0.73 (Iris-setosa, target t_1 = 1), O_2 = 0.80 (Iris-versicolor, target t_2 = 0)
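The output-layer pass follows the same pattern; a minimal Python sketch (names are mine, values from above):

```python
import math

def sigmoid(net):
    return 1.0 / (1.0 + math.exp(-net))

theta_output = [0.1, 0.2]
w_output = [[0.1, 0.3, 0.5], [0.2, 0.4, 0.6]]
H = [0.96, 0.99, 1.0]     # hidden-layer outputs from the previous step
t = [1, 0]                # targets for the Iris-setosa pattern

for k in range(2):
    # net_k = theta_k + sum_j w_{k,j} * H_j, then O_k = f(net_k)
    net_k = theta_output[k] + sum(w_output[k][j] * H[j] for j in range(3))
    O_k = sigmoid(net_k)
    print(f"net_{k+1} = {net_k:.2f}, O_{k+1} = {O_k:.2f}, Err_{k+1} = {t[k] - O_k:.2f}")
# prints net_1 = 0.99, O_1 = 0.73, Err_1 = 0.27 / net_2 = 1.39, O_2 = 0.80, Err_2 = -0.80
```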

Training Process: Backward Pass

1. Calculate the δ of the Output Layer:
   $\delta_k = O_k (1 - O_k)(t_k - O_k)$

2. Calculate the δ of the Hidden Layer:
   $\delta_j = H_j (1 - H_j) \sum_k w_{k,j} \delta_k$

3. Update the weights between Hidden & Output Layer:
   $\Delta w_{k,j} = \eta \delta_k H_j$,  $w_{k,j}(new) = w_{k,j}(old) + \Delta w_{k,j}$

4. Update the weights between Input & Hidden Layer:
   $\Delta w_{j,i} = \eta \delta_j I_i$,  $w_{j,i}(new) = w_{j,i}(old) + \Delta w_{j,i}$
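Collected into one routine, the four steps look roughly like this in Python (a sketch under the notation above; the function name and arguments are my own, and the biases are left unchanged because the worked example only updates the w weights):

```python
def backward_pass(x, H, O, t, w_hidden, w_output, eta=0.1):
    """One backpropagation update for the 4-3-2 sigmoid network, following steps 1-4."""
    # Step 1: delta_k = O_k (1 - O_k)(t_k - O_k)
    delta_out = [O[k] * (1 - O[k]) * (t[k] - O[k]) for k in range(len(O))]

    # Step 2: delta_j = H_j (1 - H_j) sum_k w_{k,j} delta_k  (uses the OLD output weights)
    delta_hid = [H[j] * (1 - H[j]) *
                 sum(w_output[k][j] * delta_out[k] for k in range(len(O)))
                 for j in range(len(H))]

    # Step 3: w_{k,j}(new) = w_{k,j}(old) + eta * delta_k * H_j
    for k in range(len(O)):
        for j in range(len(H)):
            w_output[k][j] += eta * delta_out[k] * H[j]

    # Step 4: w_{j,i}(new) = w_{j,i}(old) + eta * delta_j * I_i
    for j in range(len(H)):
        for i in range(len(x)):
            w_hidden[j][i] += eta * delta_hid[j] * x[i]

    return delta_out, delta_hid
```

The remaining slides work through each of these steps by hand for the Iris-setosa pattern.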

Backward Pass

Update the weight between Hidden & Output Layer

Current state after the forward pass:

  Hidden layer: H_1 = 0.96, H_2 = 0.99, H_3 = 1.0
  Hidden-Output weights: w_{1,1} = 0.1, w_{1,2} = 0.3, w_{1,3} = 0.5; w_{2,1} = 0.2, w_{2,2} = 0.4, w_{2,3} = 0.6
  Output layer: O_1 = 0.73 (t_1 = 1), O_2 = 0.80 (t_2 = 0)

Backward Pass

$\delta_1 = (t_1 - O_1)\,O_1 (1 - O_1) = (1 - 0.73) \times 0.73 \times (1 - 0.73) = 0.053$

$\delta_2 = (t_2 - O_2)\,O_2 (1 - O_2) = (0 - 0.80) \times 0.80 \times (1 - 0.80) = -0.128$
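In Python, step 1 for this example is one line per output neuron (a sketch using the values from the forward pass):

```python
O = [0.73, 0.80]   # outputs from the forward pass
t = [1, 0]         # targets (Iris-setosa = 1, Iris-versicolor = 0)

# delta_k = O_k * (1 - O_k) * (t_k - O_k)
delta_out = [O[k] * (1 - O[k]) * (t[k] - O[k]) for k in range(2)]
print(delta_out)   # approximately [0.053, -0.128]
```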

Training Process: Backward Pass

Next, step 2: calculate the δ of the Hidden Layer,
$\delta_j = H_j (1 - H_j) \sum_k w_{k,j} \delta_k$

Backward Pass

$\delta_1 = H_1 (1 - H_1)(\delta_1 w_{1,1} + \delta_2 w_{2,1}) = 0.96 \times (1 - 0.96) \times (0.053 \times 0.1 + (-0.128) \times 0.2) = -0.00078$

$\delta_2 = H_2 (1 - H_2)(\delta_1 w_{1,2} + \delta_2 w_{2,2}) = 0.99 \times (1 - 0.99) \times (0.053 \times 0.3 + (-0.128) \times 0.4) = -0.00035$

$\delta_3 = H_3 (1 - H_3)(\delta_1 w_{1,3} + \delta_2 w_{2,3}) = 1 \times (1 - 1) \times (0.053 \times 0.5 + (-0.128) \times 0.6) = 0$

(The δ values inside the parentheses are the output-layer deltas 0.053 and -0.128 computed above; the δ on the left-hand side of each line is the delta of the corresponding hidden neuron.)

Summary of the deltas so far:
  Output layer: δ_1 = 0.053, δ_2 = -0.128
  Hidden layer: δ_1 = -0.00078, δ_2 = -0.00035, δ_3 = 0
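Step 2 in Python, using the rounded values from the slides (a sketch; with full-precision values the deltas differ slightly in the last digits):

```python
H = [0.96, 0.99, 1.0]           # hidden-layer outputs
delta_out = [0.053, -0.128]     # output-layer deltas from step 1
w_output = [[0.1, 0.3, 0.5],    # old hidden-to-output weights w_{k,j}
            [0.2, 0.4, 0.6]]

# delta_j = H_j * (1 - H_j) * sum_k delta_k * w_{k,j}
delta_hid = [H[j] * (1 - H[j]) * sum(delta_out[k] * w_output[k][j] for k in range(2))
             for j in range(3)]
print(delta_hid)   # approximately [-0.00078, -0.00035, 0.0]
```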

Training Process: Backward Pass

Next, step 3: update the weights between Hidden & Output Layer,
$\Delta w_{k,j} = \eta \delta_k H_j$,  $w_{k,j}(new) = w_{k,j}(old) + \Delta w_{k,j}$

Backward Pass

Update the weight between Hidden & Output Layer

$\Delta w_{1,1} = \eta \delta_1 H_1 = 0.1 \times 0.053 \times 0.96 = 0.0051$,  $w_{1,1}(new) = 0.1 + 0.0051 = 0.1051$

$\Delta w_{1,2} = \eta \delta_1 H_2 = 0.1 \times 0.053 \times 0.99 = 0.0052$,  $w_{1,2}(new) = 0.3 + 0.0052 = 0.3052$

$\Delta w_{1,3} = \eta \delta_1 H_3 = 0.1 \times 0.053 \times 1.00 = 0.0053$,  $w_{1,3}(new) = 0.5 + 0.0053 = 0.5053$

Backward Pass

Update the weight between Hidden & Output Layer

$\Delta w_{2,1} = \eta \delta_2 H_1 = 0.1 \times (-0.128) \times 0.96 = -0.0123$,  $w_{2,1}(new) = 0.2 - 0.0123 = 0.188$

$\Delta w_{2,2} = \eta \delta_2 H_2 = 0.1 \times (-0.128) \times 0.99 = -0.0127$,  $w_{2,2}(new) = 0.4 - 0.0127 = 0.387$

$\Delta w_{2,3} = \eta \delta_2 H_3 = 0.1 \times (-0.128) \times 1.00 = -0.0128$,  $w_{2,3}(new) = 0.6 - 0.0128 = 0.587$
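Step 3 in Python (a sketch; the old weights and deltas are the values used above):

```python
eta = 0.1
H = [0.96, 0.99, 1.0]
delta_out = [0.053, -0.128]
w_output = [[0.1, 0.3, 0.5],    # old w_{k,j}
            [0.2, 0.4, 0.6]]

# w_{k,j}(new) = w_{k,j}(old) + eta * delta_k * H_j
w_output_new = [[w_output[k][j] + eta * delta_out[k] * H[j] for j in range(3)]
                for k in range(2)]
print(w_output_new)
# approximately [[0.1051, 0.3052, 0.5053], [0.1877, 0.3873, 0.5872]]
```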

Backward Pass

Update the weight between Input & Hidden Layer:

 i | j | w_ji(old) | I_i | Δw_ji     | w_ji(new)
---|---|-----------|-----|-----------|----------
 1 | 1 | 0.1       | 5.1 | -0.000398 | 0.099602
 2 | 1 | 0.4       | 3.5 | -0.000273 | 0.399727
 3 | 1 | 0.7       | 1.4 | -0.000109 | 0.699891
 4 | 1 | 1.0       | 0.2 | -0.000016 | 0.999984
 1 | 2 | 0.2       | 5.1 | -0.000179 | 0.199822
 2 | 2 | 0.5       | 3.5 | -0.000123 | 0.499878
 3 | 2 | 0.8       | 1.4 | -0.000049 | 0.799951
 4 | 2 | 1.1       | 0.2 | -0.000007 | 1.099993
 1 | 3 | 0.3       | 5.1 |  0        | 0.3
 2 | 3 | 0.6       | 3.5 |  0        | 0.6
 3 | 3 | 0.9       | 1.4 |  0        | 0.9
 4 | 3 | 1.2       | 0.2 |  0        | 1.2

Each row of the table is computed from:

$\Delta w_{j,i} = -\eta \frac{\partial E}{\partial w_{j,i}} = \eta \delta_j x_i$

$w_{j,i}(new) = w_{j,i}(old) + \Delta w_{j,i}$
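Step 4 in Python reproduces the table above (a sketch; names are mine, and the printed numbers match up to rounding of the deltas):

```python
eta = 0.1
x = [5.1, 3.5, 1.4, 0.2]                  # input pattern I_i
delta_hid = [-0.00078, -0.00035, 0.0]     # hidden-layer deltas from step 2
w_hidden = [[0.1, 0.4, 0.7, 1.0],         # old w_{j,i}
            [0.2, 0.5, 0.8, 1.1],
            [0.3, 0.6, 0.9, 1.2]]

# w_{j,i}(new) = w_{j,i}(old) + eta * delta_j * I_i
for j in range(3):
    for i in range(4):
        dw = eta * delta_hid[j] * x[i]
        w_hidden[j][i] += dw
        print(f"i={i+1} j={j+1}  dw_ji={dw:+.6f}  w_ji(new)={w_hidden[j][i]:.6f}")
# e.g. the first line prints: i=1 j=1  dw_ji=-0.000398  w_ji(new)=0.099602
```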