SOFTWARE as a Tool for Solving Differential Equations Using NEURAL NETWORKS
Fotiadis, D. I.
Karras, D. A.
Lagaris, I. E.
Likas, A.
Papageorgiou, D. G.
DIFFERENTIAL EQUATIONS HANDLED
• ODEs
• Systems of ODEs
• PDEs (Boundary and Initial Value Problems)
• Eigen-Value PDE Problems
• IDEs
ARTIFICIAL NEURAL NETWORKS
Features Include:
• A Host of Optimization Algorithms
• Special Merit for Sums of Squares
• Variable Bounds and Variable Fixing
• Command Driven User Interface
• Numerical Estimation of Derivatives
• Dynamic Programming of Strategies
ARTIFICIAL NEURAL NETWORKS

\[ N(x, u, w^{(1)}, w^{(2)}, v) = \sum_{i=1}^{n_3} v_i \,\sigma\!\left( \sum_{j=1}^{n_2} w^{(2)}_{ij} \,\sigma\!\left( \sum_{k=1}^{n_1} w^{(1)}_{jk}\, x_k + u_j \right) \right) \]
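The network above can be evaluated directly. A minimal sketch in plain Python, assuming the logistic sigmoid for σ; the sizes and weight values are illustrative stand-ins, not trained parameters:

```python
import math

def sigma(z):
    """Logistic sigmoid sigma(z) = 1 / (1 + exp(-z))."""
    return 1.0 / (1.0 + math.exp(-z))

def network(x, u, w1, w2, v):
    """Evaluate N(x) = sum_i v_i sigma( sum_j w2_ij sigma( sum_k w1_jk x_k + u_j ) ).

    x:  input vector (length n1)
    u:  first-layer biases (length n2)
    w1: first-layer weights (n2 rows of length n1)
    w2: second-layer weights (n3 rows of length n2)
    v:  linear output weights (length n3)
    """
    # First hidden layer: h1_j = sigma( sum_k w1_jk x_k + u_j )
    h1 = [sigma(sum(wjk * xk for wjk, xk in zip(row, x)) + uj)
          for row, uj in zip(w1, u)]
    # Second hidden layer: h2_i = sigma( sum_j w2_ij h1_j )
    h2 = [sigma(sum(wij * hj for wij, hj in zip(row, h1))) for row in w2]
    # Linear output layer
    return sum(vi * hi for vi, hi in zip(v, h2))

# Tiny example with n1 = 2 inputs and n2 = n3 = 2 hidden units.
val = network(x=[0.5, -1.0],
              u=[0.1, -0.2],
              w1=[[0.3, 0.7], [-0.5, 0.2]],
              w2=[[1.0, -1.0], [0.5, 0.5]],
              v=[0.4, 0.6])
```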
Activation Functions

[Figure: plot of the sigmoidal function σ(x) = 1/(1 + e^(−x)) over x ∈ [−10, 10]]
The Sigmoidal properties

\[ \frac{d\sigma(x)}{dx} = \sigma(x)\,[1 - \sigma(x)] \]

\[ \frac{d^2\sigma(x)}{dx^2} = \sigma(x)\,[1 - \sigma(x)]\,[1 - 2\,\sigma(x)] \]
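These identities let all derivatives of the network be expressed through σ itself. A quick numerical check of both formulas against finite differences; the check point and step sizes are arbitrary choices:

```python
import math

def sigma(x):
    """Logistic sigmoid."""
    return 1.0 / (1.0 + math.exp(-x))

def d_sigma(x):
    """First derivative via the identity sigma' = sigma (1 - sigma)."""
    s = sigma(x)
    return s * (1.0 - s)

def d2_sigma(x):
    """Second derivative via sigma'' = sigma (1 - sigma)(1 - 2 sigma)."""
    s = sigma(x)
    return s * (1.0 - s) * (1.0 - 2.0 * s)

x = 0.7                        # arbitrary check point
h1, h2 = 1e-5, 1e-4            # finite-difference step sizes (arbitrary)
fd1 = (sigma(x + h1) - sigma(x - h1)) / (2 * h1)
fd2 = (sigma(x + h2) - 2 * sigma(x) + sigma(x - h2)) / h2 ** 2
err1 = abs(fd1 - d_sigma(x))   # should be tiny
err2 = abs(fd2 - d2_sigma(x))  # should be tiny
```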
FACTS

Kolmogorov, Cybenko, and Hornik proved theorems concerning the approximation capabilities of ANNs.
The Model \( \Psi_M(x) = B(x) + Z(x)\,N(x) \) satisfies the B.C. by construction.

\[ L\,\Psi_M(x) - f(x) = 0, \quad \forall\, x \in [0,1]^N \]
Pick a set of representative points \( x_1, x_2, \ldots, x_n \) in the unit hypercube and minimize

\[ \sum_{i=1}^{n} \left[ L\,\Psi_M(x_i) - f(x_i) \right]^2 \]
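As a sketch of this collocation error, assume the toy ODE Ψ′ − Ψ = 0 (so LΨ = Ψ′ − Ψ and f = 0) with its exact solution e^x standing in for Ψ_M; the finite difference replaces the analytic network derivatives used in practice:

```python
import math

def d_dx(g, x, h=1e-6):
    """Central-difference derivative (stands in for analytic derivatives)."""
    return (g(x + h) - g(x - h)) / (2 * h)

def collocation_error(L_psi, f, points):
    """Sum over the points of [L Psi_M(x_i) - f(x_i)]^2."""
    return sum((L_psi(x) - f(x)) ** 2 for x in points)

psi = math.exp                            # exact solution of psi' - psi = 0
L_psi = lambda x: d_dx(psi, x) - psi(x)   # residual operator L psi = psi' - psi
f = lambda x: 0.0                         # right-hand side
points = [i / 10 for i in range(11)]      # representative points in [0, 1]
err = collocation_error(L_psi, f, points)  # ~0 since psi is the exact solution
```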
ILLUSTRATION
Model

\[ \Psi_M(x) = \Psi_0\,(1 - x) + \Psi_1\,x + x\,(1 - x)\,N(x) \]
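With this trial form the boundary values Ψ_M(0) = Ψ_0 and Ψ_M(1) = Ψ_1 hold no matter what the network outputs. A sketch with arbitrary illustrative values for Ψ_0, Ψ_1 and the network:

```python
# Psi0, Psi1 and the stand-in network N are arbitrary illustrative choices.
Psi0, Psi1 = 1.0, 3.0
N = lambda x: 0.7 * x + 0.2     # any function N works here

def Psi_M(x):
    """Trial solution Psi_M(x) = Psi_0 (1 - x) + Psi_1 x + x (1 - x) N(x)."""
    return Psi0 * (1 - x) + Psi1 * x + x * (1 - x) * N(x)

# The x(1 - x) factor kills the network term at both endpoints.
ok = (Psi_M(0.0) == Psi0) and (Psi_M(1.0) == Psi1)
```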
ILLUSTRATION
where

\[ B(x, y) = (1 - x)\,\Psi(0, y) + x\,\Psi(1, y) + (1 - y)\left\{ \Psi(x, 0) - \left[ (1 - x)\,\Psi(0, 0) + x\,\Psi(1, 0) \right] \right\} + y\left\{ \Psi(x, 1) - \left[ (1 - x)\,\Psi(0, 1) + x\,\Psi(1, 1) \right] \right\} \]
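A sketch checking that B(x, y) reproduces the prescribed Dirichlet values on all four edges of the unit square; the boundary function used here is the example's exact solution e^(−x)(x + y³), an illustrative choice:

```python
import math

psi = lambda x, y: math.exp(-x) * (x + y ** 3)   # boundary data (exact solution)

def B(x, y):
    """Boundary term interpolating the Dirichlet data on the unit square."""
    return ((1 - x) * psi(0, y) + x * psi(1, y)
            + (1 - y) * (psi(x, 0) - ((1 - x) * psi(0, 0) + x * psi(1, 0)))
            + y * (psi(x, 1) - ((1 - x) * psi(0, 1) + x * psi(1, 1))))

ts = [0.0, 0.25, 0.5, 0.75, 1.0]
max_edge_err = max(
    max(abs(B(0, t) - psi(0, t)) for t in ts),   # edge x = 0
    max(abs(B(1, t) - psi(1, t)) for t in ts),   # edge x = 1
    max(abs(B(t, 0) - psi(t, 0)) for t in ts),   # edge y = 0
    max(abs(B(t, 1) - psi(t, 1)) for t in ts),   # edge y = 1
)
```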
EXAMPLES
Exact: \( \Psi(x, y) = e^{-x}\,(x + y^3) \)
GRAPHS & COMPARISON

Neural solution accuracy. Plot points: training points.
[Figure: Ψ(x, y) − Ψ_M(x, y) over the training points]
GRAPHS & COMPARISON

Neural solution accuracy. Plot points: test points.
[Figure: Ψ(x, y) − Ψ_M(x, y) over the test points]
GRAPHS & COMPARISON

[Figure: Ψ(x, y) − Ψ_FE(x, y)]

GRAPHS & COMPARISON

[Figure: Ψ(x, y) − Ψ_FE(x, y)]
PERFORMANCE
Problem: \( L\,\Psi(x) = \lambda\,\Psi(x) \), with appropriate Dirichlet BC.

Minimize

\[ \frac{\sum_{i=1}^{n} \left[ L\,\Psi_M(x_i) - \lambda\,\Psi_M(x_i) \right]^2}{\sum_{i=1}^{n} \left[ \Psi_M(x_i) \right]^2} \]

EIGEN VALUE PROBLEMS

where:

\[ \lambda = \frac{\sum_{i=1}^{n} \Psi_M(x_i)\, L\,\Psi_M(x_i)}{\sum_{i=1}^{n} \left[ \Psi_M(x_i) \right]^2} \]
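The λ formula is a discrete Rayleigh quotient. A sketch, assuming the illustrative operator L = −d²/dx² on [0, 1] with trial function sin(πx), whose exact eigenvalue is π²; the second derivative is taken by a finite difference for simplicity:

```python
import math

def L_psi(psi, x, h=1e-4):
    """Operator L = -d^2/dx^2 by a central difference (illustrative)."""
    return -(psi(x + h) - 2 * psi(x) + psi(x - h)) / h ** 2

psi = lambda x: math.sin(math.pi * x)     # exact eigenfunction, eigenvalue pi^2
points = [i / 20 for i in range(1, 20)]   # interior representative points

# lambda = sum_i psi(x_i) L psi(x_i) / sum_i psi(x_i)^2
lam = (sum(psi(x) * L_psi(psi, x) for x in points)
       / sum(psi(x) ** 2 for x in points))
pi_err = abs(lam - math.pi ** 2)          # ~0: the quotient recovers lambda
```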
ψ(0) = 0, ψ(r) ~ e^(−kr), k > 0
Model: ΨM ( x) = N ( x)
“Error” to be minimized:
Domain terms + Boundary terms
\[ \sum_{i=1}^{n} \left[ L\,\Psi_M(r_i) - f(r_i) \right]^2 + \beta \sum_{i=1}^{m} \left[ \Psi_M(R_i) - b_i \right]^2 \]
\[ \sum_{k=1}^{m} a_k\, e^{-\lambda \left| R_i - R_k \right|^2} = b_i - N(R_i) \]
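This linear system can be solved directly for the coefficients a_k. A sketch with NumPy, assuming illustrative boundary points R_i, target values b_i, width λ, and a stand-in for the trained network N:

```python
import numpy as np

lam = 1.0                               # RBF width parameter (illustrative)
R = np.array([0.0, 0.5, 1.0])           # boundary points R_1..R_m
b = np.array([0.0, 0.2, 0.0])           # prescribed boundary values b_i
N = lambda r: 0.1 * r                   # stand-in for the trained network

# Gaussian kernel matrix A_ik = exp(-lam |R_i - R_k|^2)
A = np.exp(-lam * (R[:, None] - R[None, :]) ** 2)
a = np.linalg.solve(A, b - N(R))        # coefficients a_k

# With these coefficients the boundary values are matched exactly.
residual = float(np.max(np.abs(A @ a + N(R) - b)))
```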
“Error”:

\[ \sum_{i=1}^{n} \left[ L\,\Psi_M(r_i) - f(r_i) \right]^2 \]
Pros & Cons ...

The RBF-Synergy is:
• Computationally costly: a linear system is solved each time the model is evaluated.
• Exact in satisfying the BC.
• Computationally efficient
IN PRACTICE . . .

Conclusions: Experiments on several model problems show performance similar to that reported earlier.
GENERAL OBSERVATIONS
Levenberg-Marquardt
• For Sum-Of-Squares
THE USER’S PART
http://nrt.cs.uoi.gr/merlin/
It is maintained, supported, and FREELY available to the scientific community.
FUTURE DEVELOPMENTS
Hardware Implementation on
NEUROPROCESSORS