
NONLINEAR EQUATIONS

Modeling, Simulation and Optimization

Ramesh Vulavala
D. J. Sanghvi College of Engineering

May 12, 2017

Ramesh Vulavala DJSCE Mumbai 1



TOPICS COVERED

NONLINEAR EQUATIONS
  Introduction
  Newton's Method
  Armijo Line Search
  Successive Substitution




INTRODUCTION

- Sets of nonlinear equations appear in the steady-state simulation of chemical processes:

      f(x) = 0

  where f(x) is the vector of nonlinear functions:

      f(x) = \begin{bmatrix} f_1(x) \\ f_2(x) \\ \vdots \\ f_n(x) \end{bmatrix}

- The unknowns are represented by the variable vector:

      x = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}

NEWTON'S METHOD

Newton's method is commonly employed to solve nonlinear equations. As an example, let us consider the solution of two equations in two unknowns:

    f_1(x_1, x_2) = 0
    f_2(x_1, x_2) = 0

Expanding the two functions in a Taylor series around an initial estimate (\bar{x}_1, \bar{x}_2) of the roots, we get:

    f_1(x_1, x_2) = f_1(\bar{x}_1, \bar{x}_2) + \frac{\partial f_1}{\partial x_1}(x_1 - \bar{x}_1) + \frac{\partial f_1}{\partial x_2}(x_2 - \bar{x}_2) + \dots

    f_2(x_1, x_2) = f_2(\bar{x}_1, \bar{x}_2) + \frac{\partial f_2}{\partial x_1}(x_1 - \bar{x}_1) + \frac{\partial f_2}{\partial x_2}(x_2 - \bar{x}_2) + \dots


- In this expansion only the first-order terms are kept.

- The partial derivatives are evaluated at the initial guess vector \bar{x} = [\bar{x}_1 \; \bar{x}_2]^T.

- Setting the expansions equal to zero, and expressing the two equations in matrix form, we can write:

      \begin{bmatrix} \frac{\partial f_1}{\partial x_1} & \frac{\partial f_1}{\partial x_2} \\ \frac{\partial f_2}{\partial x_1} & \frac{\partial f_2}{\partial x_2} \end{bmatrix}
      \begin{bmatrix} x_1 - \bar{x}_1 \\ x_2 - \bar{x}_2 \end{bmatrix}
      = -\begin{bmatrix} f_1(\bar{x}_1, \bar{x}_2) \\ f_2(\bar{x}_1, \bar{x}_2) \end{bmatrix}


- The first matrix is the Jacobian, evaluated at the initial guess vector.
- The multiplying vector holds the incremental variations in the variables.
- The vector on the RHS holds the function values evaluated at the initial guess vector.
- For the general case, this can be written as:

      J(x)\,\Delta x = -f(x)


     
    J = \begin{bmatrix}
        \frac{\partial f_1}{\partial x_1} & \frac{\partial f_1}{\partial x_2} & \dots & \frac{\partial f_1}{\partial x_n} \\
        \frac{\partial f_2}{\partial x_1} & \frac{\partial f_2}{\partial x_2} & \dots & \frac{\partial f_2}{\partial x_n} \\
        \vdots & \vdots & \ddots & \vdots \\
        \frac{\partial f_n}{\partial x_1} & \frac{\partial f_n}{\partial x_2} & \dots & \frac{\partial f_n}{\partial x_n}
    \end{bmatrix}

    \Delta x = \begin{bmatrix} x_1 - \bar{x}_1 \\ x_2 - \bar{x}_2 \\ \vdots \\ x_n - \bar{x}_n \end{bmatrix}
             = \begin{bmatrix} \Delta x_1 \\ \Delta x_2 \\ \vdots \\ \Delta x_n \end{bmatrix}


    f(x) = \begin{bmatrix} f_1(x) \\ f_2(x) \\ \vdots \\ f_n(x) \end{bmatrix}

- The increments in each iteration can be obtained from:

      \Delta x = -J(x)^{-1} f(x)

- In general, the iterative scheme is represented by:

      x^{k+1} = x^k - J(x^k)^{-1} f(x^k)

- In practice the problem is solved as a set of linear equations, using standard Gaussian elimination procedures, rather than by forming the inverse explicitly.
- Good initial values are crucial for convergence.
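As an illustration (not part of the lecture slides), the scheme above can be sketched in a few lines of NumPy, with `numpy.linalg.solve` standing in for the Gaussian elimination step; the test system is the worked example used later in these notes:

```python
import numpy as np

# Worked example from later in these notes:
#   f1 = 2*x1^2 + x2^2 - 6,  f2 = x1 + 2*x2 - 3.5
def f(x):
    return np.array([2*x[0]**2 + x[1]**2 - 6.0, x[0] + 2*x[1] - 3.5])

def jacobian(x):
    return np.array([[4*x[0], 2*x[1]],
                     [1.0,    2.0]])

def newton(f, jac, x0, tol=1e-10, max_iter=50):
    """Basic Newton iteration: solve J(x) dx = -f(x), then update x <- x + dx."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = f(x)
        if np.max(np.abs(fx)) < tol:            # terminate on |f_i(x^k)| <= tol
            break
        dx = np.linalg.solve(jac(x), -fx)       # Gaussian elimination step
        x = x + dx
    return x

root = newton(f, jacobian, [1.0, 1.0])
print(root)
```

Starting from (1, 1), this converges to the same root reported in the Armijo example later in the notes.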

- Possible criteria for termination of the algorithm:

      |\Delta x_i^k| \le \epsilon_i                  for i = 1, 2, \dots, n
      |x_i^{k+1} - x_i^k| \le \epsilon_i             for i = 1, 2, \dots, n
      |f_i(x^k)| \le \epsilon_i                      for i = 1, 2, \dots, n
      |f_i(x^{k+1}) - f_i(x^k)| \le \epsilon_i       for i = 1, 2, \dots, n

ARMIJO LINE SEARCH


Newton's method can be written as:

    x^{k+1} = x^k + \alpha p^k

where \alpha \in (0, 1] and p^k is the directional change predicted by Newton's method.

Define the function:

    \phi(x) = \frac{1}{2} f(x)^T f(x)

So minimizing \phi(x) would automatically lead us towards f(x) \to 0. Using a Taylor series to expand \phi(x) around x^k, we can write:

    \phi(x^{k+1}) = \phi(x^k) + \alpha \nabla\phi(x^k)^T p^k + \frac{\alpha^2}{2} (p^k)^T H(x^k)\, p^k + \dots


Since \phi(x) = f(x)^T f(x)/2, for the two-equation case we get:

    \phi(x) = \frac{1}{2} \begin{bmatrix} f_1(x) & f_2(x) \end{bmatrix} \begin{bmatrix} f_1(x) \\ f_2(x) \end{bmatrix}

Or

    \phi(x) = \frac{f_1(x)^2 + f_2(x)^2}{2}

The gradient is given by:

    \nabla\phi(x) = \begin{bmatrix} \frac{\partial \phi(x)}{\partial x_1} \\ \frac{\partial \phi(x)}{\partial x_2} \end{bmatrix}

    \frac{\partial \phi(x)}{\partial x_1} = \frac{1}{2}\left[ 2 f_1(x) \frac{\partial f_1(x)}{\partial x_1} + 2 f_2(x) \frac{\partial f_2(x)}{\partial x_1} \right]


 
    \frac{\partial \phi(x)}{\partial x_2} = \frac{1}{2}\left[ 2 f_1(x) \frac{\partial f_1(x)}{\partial x_2} + 2 f_2(x) \frac{\partial f_2(x)}{\partial x_2} \right]

Combining the two expressions, we can write:

    \nabla\phi(x) = \begin{bmatrix} f_1(x) \frac{\partial f_1(x)}{\partial x_1} + f_2(x) \frac{\partial f_2(x)}{\partial x_1} \\ f_1(x) \frac{\partial f_1(x)}{\partial x_2} + f_2(x) \frac{\partial f_2(x)}{\partial x_2} \end{bmatrix}

This can be written as:

    \nabla\phi(x) = \begin{bmatrix} \frac{\partial f_1(x)}{\partial x_1} & \frac{\partial f_2(x)}{\partial x_1} \\ \frac{\partial f_1(x)}{\partial x_2} & \frac{\partial f_2(x)}{\partial x_2} \end{bmatrix} \begin{bmatrix} f_1(x) \\ f_2(x) \end{bmatrix}

Or, simply:

    \nabla\phi(x) = J(x)^T f(x)

The Newton step is given by:

    p^k = -J(x^k)^{-1} f(x^k)

Now, consider the expression:

    \nabla\phi(x^k)^T p^k = -f(x^k)^T J(x^k) J(x^k)^{-1} f(x^k)

which simplifies to:

    \nabla\phi(x^k)^T p^k = -f(x^k)^T f(x^k)

Or, simply:

    \nabla\phi(x^k)^T p^k = -2\phi(x^k) < 0
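This identity is easy to check numerically; a minimal sketch (an illustration assuming the two-equation worked example from the later slides, evaluated at the guess x = (1, 1)):

```python
import numpy as np

def f(x):
    # Worked example from the later slides: f1 = 2*x1^2 + x2^2 - 6, f2 = x1 + 2*x2 - 3.5
    return np.array([2*x[0]**2 + x[1]**2 - 6.0, x[0] + 2*x[1] - 3.5])

def jac(x):
    return np.array([[4*x[0], 2*x[1]], [1.0, 2.0]])

x = np.array([1.0, 1.0])
fx = f(x)
phi = 0.5 * fx @ fx                  # phi(x) = f^T f / 2
grad_phi = jac(x).T @ fx             # grad phi = J^T f
p = np.linalg.solve(jac(x), -fx)     # Newton step p = -J^{-1} f

# grad_phi . p should equal -2*phi, confirming p is a descent direction
print(grad_phi @ p, -2 * phi)        # both ≈ -9.25
```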


Hence, from the Taylor series expansion, considering a very small \alpha, we can write:

    \phi(x^{k+1}) - \phi(x^k) \approx -2\alpha\,\phi(x^k) < 0

So, for a sufficiently small \alpha, the Newton step will reduce \phi(x).

This is an important descent property.


Since

    \frac{\phi(x^{k+1}) - \phi(x^k)}{\alpha} \approx -2\phi(x^k)

we can write:

    \left. \frac{\partial \phi(x^k + \alpha p^k)}{\partial \alpha} \right|_{\alpha = 0} = -2\phi(x^k)


- Set \alpha = 1.
- Evaluate \phi(x^k + \alpha p^k).
- If \phi(x^k + \alpha p^k) - \phi(x^k) \le -2\epsilon\alpha\,\phi(x^k), then the step size has been found. Set x^{k+1} = x^k + \alpha p^k and check for convergence.
- Else: let \bar{\alpha} = \max\{\beta\alpha, \alpha_q\} where

      \alpha_q = \frac{\alpha^2 \phi(x^k)}{(2\alpha - 1)\phi(x^k) + \phi(x^k + \alpha p^k)}

  Set \alpha = \bar{\alpha} and iterate.

- Normally \epsilon = 0.1 and \beta = 0.1.
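A sketch of this procedure in Python (an illustration, not the lecture's own code; the function name `newton_armijo` and the 50-step cap on the inner backtracking loop are choices made here, and the test system is the worked example from the next slides):

```python
import numpy as np

def phi(f, x):
    fx = f(x)
    return 0.5 * fx @ fx                          # merit function f^T f / 2

def newton_armijo(f, jac, x0, eps=0.1, beta=0.1, tol=1e-10, max_iter=100):
    """Newton's method safeguarded by the Armijo line search sketched above."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = f(x)
        if np.max(np.abs(fx)) < tol:
            break
        p = np.linalg.solve(jac(x), -fx)          # Newton direction
        phi0 = 0.5 * fx @ fx
        alpha = 1.0
        for _ in range(50):                       # backtrack until sufficient decrease
            phi1 = phi(f, x + alpha * p)
            # Armijo condition: phi(x + a p) - phi(x) <= -2 eps a phi(x)
            if phi1 - phi0 <= -2.0 * eps * alpha * phi0:
                break
            # quadratic-interpolation candidate, safeguarded by beta*alpha
            alpha_q = alpha**2 * phi0 / ((2.0 * alpha - 1.0) * phi0 + phi1)
            alpha = max(beta * alpha, alpha_q)
        x = x + alpha * p
    return x

# Worked example from the next slides
def f(x):
    return np.array([2*x[0]**2 + x[1]**2 - 6.0, x[0] + 2*x[1] - 3.5])

def jac(x):
    return np.array([[4*x[0], 2*x[1]], [1.0, 2.0]])

root = newton_armijo(f, jac, [0.6263, 1.2527])
print(root)
```

From the starting point (0.6263, 1.2527) the first accepted step is \alpha = 0.1, matching the results table that follows.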



Consider the following two equations:

    f_1(x) = 2x_1^2 + x_2^2 - 6 = 0

    f_2(x) = x_1 + 2x_2 - 3.5 = 0

Taking the partial derivatives, we get:

    \frac{\partial f_1(x)}{\partial x_1} = 4x_1, \qquad \frac{\partial f_1(x)}{\partial x_2} = 2x_2

    \frac{\partial f_2(x)}{\partial x_1} = 1, \qquad \frac{\partial f_2(x)}{\partial x_2} = 2

Therefore the Jacobian is given by:

    J(x) = \begin{bmatrix} 4x_1 & 2x_2 \\ 1 & 2 \end{bmatrix}


The inverse of the Jacobian is given by:

    J(x)^{-1} = \frac{1}{8x_1 - 2x_2} \begin{bmatrix} 2 & -2x_2 \\ -1 & 4x_1 \end{bmatrix}

The results are as follows (the last column is the accepted step size \alpha):

    x_1          x_2          f_1(x)        f_2(x)        \alpha
    0.62630000   1.25270000   -3.64623933   -0.36830000   0.100000
    0.88058103   1.14397448   -3.14047647   -0.33147000   0.548016
    1.51683230   0.91667432   -0.55814775   -0.14981907   1.000000
    1.59853307   0.95073347    0.01451006    0.00000000   1.000000
    1.59586744   0.95206628    0.00001599    0.00000000   1.000000
    1.59586450   0.95206775    0.00000000    0.00000000   1.000000
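As a quick sanity check, the final row of the table can be substituted back into the two equations:

```python
# Substitute the converged values from the last table row into the system
x1, x2 = 1.59586450, 0.95206775
f1 = 2*x1**2 + x2**2 - 6.0
f2 = x1 + 2*x2 - 3.5
print(f1, f2)   # both ~ 0, within roundoff of the tabulated digits
```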


SUCCESSIVE SUBSTITUTION

Here the given problem of solving for the roots of a set of nonlinear equations given by:

    f(x) = 0

is re-formulated as:

    x^{k+1} = g(x^k)

where g(x^k) is a vector of functions obtained from the original functions. Now, using a Taylor series, we can expand the functions as:

    g(x^k) = g(x^{k-1}) + J(x^{k-1})^T \Delta x^k + \dots

where

    \Delta x^k = \begin{bmatrix} x_1^k - x_1^{k-1} \\ x_2^k - x_2^{k-1} \\ \vdots \\ x_n^k - x_n^{k-1} \end{bmatrix} = \begin{bmatrix} \Delta x_1^k \\ \Delta x_2^k \\ \vdots \\ \Delta x_n^k \end{bmatrix}
     
    J(x) = \begin{bmatrix}
        \frac{\partial g_1}{\partial x_1} & \frac{\partial g_1}{\partial x_2} & \dots & \frac{\partial g_1}{\partial x_n} \\
        \frac{\partial g_2}{\partial x_1} & \frac{\partial g_2}{\partial x_2} & \dots & \frac{\partial g_2}{\partial x_n} \\
        \vdots & \vdots & \ddots & \vdots \\
        \frac{\partial g_n}{\partial x_1} & \frac{\partial g_n}{\partial x_2} & \dots & \frac{\partial g_n}{\partial x_n}
    \end{bmatrix}

If the Jacobian does not vanish, and is fairly constant near the solution, we can write:

    \Delta x^{k+1} = x^{k+1} - x^k = g(x^k) - g(x^{k-1}) = J(x^k)^T \Delta x^k

Or, more compactly:

    \Delta x^{k+1} = \Lambda\,\Delta x^k, \quad \text{where } \Lambda = J(x^k)^T


Taking the norm on both sides, we have:

    \|\Delta x^{k+1}\| \le \|\Lambda\|\,\|\Delta x^k\|

If we use the Euclidean norm, then \|\Lambda\| = |\lambda|_{max}.

Applying the relation recursively, we get:

    \|\Delta x^k\| \le (|\lambda|_{max})^k\,\|\Delta x^0\|

The necessary and sufficient condition for convergence is |\lambda|_{max} < 1.
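This contraction behaviour is easy to see on a small linear iteration; a sketch (an illustration, with a hand-picked matrix whose eigenvalues are 0.6 and 0.3, so |\lambda|_{max} = 0.6 < 1):

```python
import numpy as np

# Iterate dx^{k+1} = A dx^k for a matrix with spectral radius < 1
A = np.array([[0.5, 0.2],
              [0.1, 0.4]])              # eigenvalues 0.6 and 0.3
rho = max(abs(np.linalg.eigvals(A)))    # |lambda|_max

dx = np.array([1.0, 1.0])
norms = []
for _ in range(20):
    dx = A @ dx
    norms.append(np.linalg.norm(dx))

# After the transient, the ratio of successive increment norms approaches rho
print(rho, norms[-1] / norms[-2])
```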


Consider the problem:

    x_1 = 1 - 0.5\exp(0.7(1 - x_2) - 1)

    x_2 = 2 - 0.3\exp(0.5(x_1 + x_2))

Start with x^0 = [0.8 \; 0.8]^T and estimate the maximum eigenvalue from the iterates.
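A direct sketch of the iteration (an illustration; the constant 2 in the second map is taken to be consistent with the tabulated g values on the results slide):

```python
import math

def g(x1, x2):
    # Fixed-point maps for the example problem
    g1 = 1.0 - 0.5 * math.exp(0.7 * (1.0 - x2) - 1.0)
    g2 = 2.0 - 0.3 * math.exp(0.5 * (x1 + x2))
    return g1, g2

x1, x2 = 0.8, 0.8                     # starting point x^0
for _ in range(500):
    x1_new, x2_new = g(x1, x2)
    done = abs(x1_new - x1) < 1e-12 and abs(x2_new - x2) < 1e-12
    x1, x2 = x1_new, x2_new
    if done:
        break
print(x1, x2)
```

The iterates settle down to the fixed point near (0.838, 1.178), in line with the trailing rows of the results table.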


The results are as follows:

    x_1          x_2          g_1(x)       g_2(x)
    1.03850300   1.11778000   0.83061705   1.11823642
    0.83061705   1.11823642   0.83067116   1.20510565
    0.83067116   1.20510565   0.84066098   1.16979647
    0.84066098   1.16979647   0.83667361   1.18024034
    0.83667361   1.18024034   0.83786329   1.17758967
    0.83786329   1.17758967   0.83756217   1.17819022

Max. eigenvalue = 0.984201
