
Introduction to Simulation - Lecture 10

Modified Newton Methods


Jacob White

Thanks to Deepak Ramaswamy, Jaime Peraire, Michal Rewienski, and Karen Veroy

Outline

Damped Newton Schemes
  Globally Convergent if Jacobian is Nonsingular
  Difficulty with Singular Jacobians

Introduce Continuation Schemes
  Problem with Source/Load stepping
  More General Continuation Scheme

Improving Continuation Efficiency
  Better first guess for each continuation step
  Arc Length Continuation

Multidimensional
Newton Method

Newton Algorithm

Newton Algorithm for Solving F(x) = 0

x^0 = Initial Guess, k = 0
Repeat {
    Compute F(x^k), J_F(x^k)
    Solve J_F(x^k)(x^{k+1} - x^k) = -F(x^k) for x^{k+1}
    k = k + 1
} Until ||x^{k+1} - x^k||, ||F(x^{k+1})|| small enough
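A compact code sketch of this loop may help; the following NumPy version is my own illustration (not the lecture's code), with F returning the residual vector and J its Jacobian matrix:

```python
import numpy as np

def newton(F, J, x0, tol=1e-10, max_iter=50):
    """Plain multidimensional Newton iteration (a sketch, not the lecture's code)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        dx = np.linalg.solve(J(x), -Fx)   # J_F(x^k)(x^{k+1} - x^k) = -F(x^k)
        x = x + dx
        if np.linalg.norm(dx) < tol and np.linalg.norm(F(x)) < tol:
            break
    return x

# example use: intersect the unit circle with the line x = y
root = newton(lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]]),
              lambda x: np.array([[2*x[0], 2*x[1]], [1.0, -1.0]]),
              x0=[1.0, 0.0])
```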

Multidimensional
Newton Method

Multidimensional
Convergence Theorem

Theorem Statement

Main Theorem
If
  a) ||J_F^{-1}(x^k)|| <= β    (Inverse is bounded)
  b) ||J_F(x) - J_F(y)|| <= A ||x - y||    (Derivative is Lipschitz continuous)

Then Newton's method converges given a sufficiently close initial guess.


Multidimensional
Newton Method

Multidimensional
Convergence Theorem

Implications

If a function's first derivative never goes to zero, and its second derivative is never too large, then Newton's method can be used to find the zero of the function, provided you already know the answer.

Need a way to develop Newton methods which converge regardless of the initial guess!


Non-converging
Case

1-D Picture

[Figure: 1-D plot of f(x); the Newton step from the starting point lands at x^1, far from the root]

Limiting the changes in X might improve convergence



Newton Method
with Limiting

Newton Algorithm

Newton Algorithm for Solving F(x) = 0

x^0 = Initial Guess, k = 0
Repeat {
    Compute F(x^k), J_F(x^k)
    Solve J_F(x^k) Δx^{k+1} = -F(x^k) for Δx^{k+1}
    x^{k+1} = x^k + limited(Δx^{k+1})
    k = k + 1
} Until ||Δx^{k+1}||, ||F(x^{k+1})|| small enough
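The slides leave limited(·) unspecified; one common choice (an assumption on my part, not the lecture's definition) is to cap the norm of the update:

```python
import numpy as np

def limited(dx, max_step=1.0):
    """Assumed limiter: scale the Newton step so its 2-norm never exceeds max_step."""
    nrm = np.linalg.norm(dx)
    return dx if nrm <= max_step else dx * (max_step / nrm)
```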

Damped Newton
Scheme

Newton Method
with Limiting
General Damping Scheme

Solve J_F(x^k) Δx^{k+1} = -F(x^k) for Δx^{k+1}

x^{k+1} = x^k + α^k Δx^{k+1}

Key Idea: Line Search

Pick α^k to minimize ||F(x^k + α^k Δx^{k+1})||_2^2

  ||F(x^k + α^k Δx^{k+1})||_2^2 = F(x^k + α^k Δx^{k+1})^T F(x^k + α^k Δx^{k+1})

The method performs a one-dimensional search in the Newton direction.
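As a deliberately crude illustration of the line search, the sketch below (names are mine) samples a few damping coefficients in (0, 1] and keeps the one giving the smallest ||F||_2^2:

```python
import numpy as np

def line_search_alpha(F, x, dx, candidates=(1.0, 0.5, 0.25, 0.125, 0.0625)):
    """Pick alpha in (0, 1] that (approximately) minimizes ||F(x + alpha*dx)||_2^2."""
    best_alpha, best_val = None, np.inf
    for alpha in candidates:
        val = float(np.sum(F(x + alpha * dx) ** 2))   # ||F||_2^2
        if val < best_val:
            best_alpha, best_val = alpha, val
    return best_alpha
```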

Newton Method
with Limiting

Damped Newton

Convergence Theorem

If
  a) ||J_F^{-1}(x^k)|| <= β    (Inverse is bounded)
  b) ||J_F(x) - J_F(y)|| <= A ||x - y||    (Derivative is Lipschitz continuous)

Then
  There exists a set of α^k's in (0,1] such that

  ||F(x^{k+1})|| = ||F(x^k + α^k Δx^{k+1})|| <= γ ||F(x^k)||  with γ < 1

Every step reduces ||F||: global convergence!



Damped Newton
Newton Method
with Limiting
Nested Iteration
x^0 = Initial Guess, k = 0
Repeat {
    Compute F(x^k), J_F(x^k)
    Solve J_F(x^k) Δx^{k+1} = -F(x^k) for Δx^{k+1}
    Find α^k in (0,1] such that ||F(x^k + α^k Δx^{k+1})|| is minimized
    x^{k+1} = x^k + α^k Δx^{k+1}
    k = k + 1
} Until ||Δx^{k+1}||, ||F(x^{k+1})|| small enough

Newton Method
with Limiting

Damped Newton
Example

[Circuit: 1 V source at node v1, a 10 Ω resistor between v1 and v2, and a diode with voltage Vd from v2 to ground]

Element equations:
  I_r - (1/10) V_r = 0
  I_d - I_s (e^{Vd/Vt} - 1) = 0

Nodal Equations with Numerical Values

  f(v2) = (v2 - 1)/10 + 10^{-16} (e^{v2/0.025} - 1) = 0

Newton Method
with Limiting

Damped Newton
Example cont.

  f(v2) = (v2 - 1)/10 + 10^{-16} (e^{v2/0.025} - 1) = 0
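To make the example concrete, here is a small script (my own, using the circuit values above) that applies a damped Newton iteration to f(v2). Plain Newton from v2 = 0 first jumps into the steep exponential region and then creeps back roughly one Vt per step; the damped update stays controlled:

```python
import numpy as np

def f(v2):
    """Nodal residual: (v2 - 1)/10 + 1e-16*(exp(v2/0.025) - 1)."""
    return (v2 - 1.0) / 10.0 + 1e-16 * (np.exp(v2 / 0.025) - 1.0)

def df(v2):
    """Derivative of the residual (the 1x1 Jacobian)."""
    return 1.0 / 10.0 + (1e-16 / 0.025) * np.exp(v2 / 0.025)

v = 0.0
for _ in range(100):
    dv = -f(v) / df(v)                      # Newton direction
    alpha = 1.0
    while abs(f(v + alpha * dv)) >= abs(f(v)) and alpha > 1e-12:
        alpha *= 0.5                        # damp until |f| decreases
    v += alpha * dv
    if abs(f(v)) < 1e-14:
        break
print(v)   # converges to roughly 0.8 V, the diode's forward drop
```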

Damped Newton
Newton Method
with Limiting
Nested Iteration
x^0 = Initial Guess, k = 0
Repeat {
    Compute F(x^k), J_F(x^k)
    Solve J_F(x^k) Δx^{k+1} = -F(x^k) for Δx^{k+1}
    Find α^k in (0,1] such that ||F(x^k + α^k Δx^{k+1})|| is minimized
    x^{k+1} = x^k + α^k Δx^{k+1}
    k = k + 1
} Until ||Δx^{k+1}||, ||F(x^{k+1})|| small enough

How can one find the damping coefficients?
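One standard practical answer (an assumption here, not the specific choice the proof below derives) is simple backtracking: start from the full Newton step and halve α until ||F|| decreases.

```python
import numpy as np

def backtracking_alpha(F, x, dx, alpha_min=1e-8):
    """Halve alpha until ||F(x + alpha*dx)|| < ||F(x)|| (assumed strategy)."""
    f0 = np.linalg.norm(F(x))
    alpha = 1.0
    while np.linalg.norm(F(x + alpha * dx)) >= f0 and alpha > alpha_min:
        alpha *= 0.5
    return alpha
```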



Newton Method
with Limiting

Damped Newton

Theorem Proof

By definition of the Newton iteration

  x^{k+1} = x^k - α^k J_F^{-1}(x^k) F(x^k),   where -J_F^{-1}(x^k) F(x^k) is the Newton direction

Multidimensional Mean Value Lemma

  ||F(x) - F(y) - J_F(y)(x - y)|| <= (A/2) ||x - y||^2

Combining

  ||F(x^{k+1})|| <= ||F(x^k) - α^k J_F(x^k) J_F^{-1}(x^k) F(x^k)|| + (A/2) ||α^k J_F^{-1}(x^k) F(x^k)||^2

Newton Method
with Limiting

Damped Newton

Theorem Proof-Cont

From the previous slide

  ||F(x^{k+1})|| <= ||F(x^k) - α^k J_F(x^k) J_F^{-1}(x^k) F(x^k)|| + (A/2) ||α^k J_F^{-1}(x^k) F(x^k)||^2

Combining terms and moving scalars out of the norms

  ||F(x^{k+1})|| <= (1 - α^k) ||F(x^k)|| + (α^k)^2 (A/2) ||J_F^{-1}(x^k) F(x^k)||^2

Using the Jacobian bound and splitting the norm

  ||F(x^{k+1})|| <= (1 - α^k) ||F(x^k)|| + (α^k)^2 (β^2 A/2) ||F(x^k)||^2

Yields a quadratic in the damping coefficient α^k.

Newton Method
with Limiting

Damped Newton

Theorem Proof-Cont-II

Simplifying the quadratic from the previous slide

  ||F(x^{k+1})|| <= [1 - α^k + (α^k)^2 (β^2 A/2) ||F(x^k)||] ||F(x^k)||

Two cases:

  1)  (β^2 A/2) ||F(x^k)|| < 1/2:  pick α^k = 1 (standard Newton), so

        1 - α^k + (α^k)^2 (β^2 A/2) ||F(x^k)|| = (β^2 A/2) ||F(x^k)|| < 1/2

  2)  (β^2 A/2) ||F(x^k)|| >= 1/2:  pick α^k = 1/(β^2 A ||F(x^k)||) <= 1, so

        1 - α^k + (α^k)^2 (β^2 A/2) ||F(x^k)|| = 1 - 1/(2 β^2 A ||F(x^k)||) < 1
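Filling in the minimization step behind case 2 (the shorthand c is mine, not from the slides): the bracketed factor is a quadratic in α whose minimizer is exactly the α^k chosen above.

```latex
\begin{align*}
q(\alpha) &= 1 - \alpha + c\,\alpha^{2}, & c &= \tfrac{\beta^{2}A}{2}\,\|F(x^{k})\| \ge \tfrac{1}{2}, \\
q'(\alpha) &= -1 + 2c\,\alpha = 0 \;\Rightarrow\; \alpha^{*} = \frac{1}{2c}
            = \frac{1}{\beta^{2}A\,\|F(x^{k})\|} \le 1, \\
q(\alpha^{*}) &= 1 - \frac{1}{4c} = 1 - \frac{1}{2\,\beta^{2}A\,\|F(x^{k})\|} < 1 .
\end{align*}
```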

Newton Method
with Limiting

Damped Newton

Theorem Proof-Cont-III

Combining the results from the previous slide

  ||F(x^{k+1})|| <= γ^k ||F(x^k)||

This is not good enough: we need a γ that is independent of k.

The above result does imply

  ||F(x^{k+1})|| <= ||F(x^0)||    (not yet a convergence theorem)

For the case where (β^2 A/2) ||F(x^k)|| > 1/2,

  γ^k = 1 - 1/(2 β^2 A ||F(x^k)||) <= 1 - 1/(2 β^2 A ||F(x^0)||) < 1

Note the proof technique:
  First, show that the iterates do not increase ||F||.
  Second, use the non-increasing fact to prove convergence.

Damped Newton
Newton Method
with Limiting
Nested Iteration
x^0 = Initial Guess, k = 0
Repeat {
    Compute F(x^k), J_F(x^k)
    Solve J_F(x^k) Δx^{k+1} = -F(x^k) for Δx^{k+1}
    Find α^k in (0,1] such that ||F(x^k + α^k Δx^{k+1})|| is minimized
    x^{k+1} = x^k + α^k Δx^{k+1}
    k = k + 1
} Until ||Δx^{k+1}||, ||F(x^{k+1})|| small enough

Many approaches to finding α^k



Newton Method
with Limiting

Damped Newton

Singular Jacobian Problem

[Figure: 1-D function f(x) with a local minimum that stays above zero; the damped iterates stall near that minimum]

Damped Newton methods push iterates to local minima of ||F||
These are points where the Jacobian is singular

Continuation Schemes
Basic Concepts
Source or Load-Stepping

Newton converges given a close initial guess

Generate a sequence of problems
  Make sure the previous problem generates a guess for the next problem

Heat-conducting bar example
  1. Start with the heat off; T = 0 is a very close initial guess
  2. Increase the heat slightly; T = 0 is a good initial guess
  3. Increase the heat again

Continuation Schemes

Basic Concepts

General Setting

Solve F(x(λ), λ) = 0 where:
  a) F(x(0), 0) = 0 is easy to solve    (starts the continuation)
  b) F(x(1), 1) = F(x(1))    (the original problem; ends the continuation)
  c) x(λ) is sufficiently smooth    (hard to ensure!)

[Figure: x(λ) plotted against λ from 0 to 1; a curve that jumps discontinuously is disallowed]

Continuation Schemes

Basic Concepts

Template Algorithm

Solve F(x(0), 0) = 0, set x(prev) = x(0)
Δλ = 0.01, λ = Δλ

While λ < 1 {
    x^0(λ) = x(prev)
    Try to solve F(x(λ), λ) = 0 with Newton
    If Newton converged
        x(prev) = x(λ),  λ = λ + Δλ,  Δλ = 2Δλ
    Else
        Δλ = Δλ/2,  λ = λ_prev + Δλ
}
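A sketch of this template in code (names are mine; solve_newton(x_guess, lam) is assumed to return the solution and a convergence flag for the continuation problem at λ = lam):

```python
def continuation(solve_newton, x0, dlam0=0.01, dlam_min=1e-6):
    """Template continuation loop: grow the step after success, halve it after failure."""
    x_prev, lam_prev = x0, 0.0      # x0 solves the easy lambda = 0 problem
    dlam = dlam0
    lam = dlam
    while lam_prev < 1.0:
        lam = min(lam, 1.0)                       # never step past the original problem
        x, converged = solve_newton(x_prev, lam)  # previous solution seeds Newton
        if converged:
            x_prev, lam_prev = x, lam
            dlam *= 2.0
            lam = lam_prev + dlam
        else:
            dlam *= 0.5
            if dlam < dlam_min:
                raise RuntimeError("continuation step shrank below dlam_min")
            lam = lam_prev + dlam
    return x_prev
```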


Continuation Schemes
Basic Concepts
Source/Load Stepping Examples

[Circuit: voltage source Vs in series with a resistor R driving node v, with a diode from v to ground]

Source stepping (scale the source by λ):
  f(v(λ), λ) = i_diode(v) + (1/R)(v - λ Vs) = 0
  ∂f(v, λ)/∂v = ∂i_diode(v)/∂v + 1/R    (not λ dependent!)

Load stepping (scale an applied load f_L by λ):
  F(x, λ) = 0:
    f_x(x, y) = 0
    f_y(x, y) + λ f_L = 0

Source/Load Stepping Does Not Alter the Jacobian
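For the circuit above, a source-stepped residual might look like the sketch below (the element values Vs = 1 V, R = 10 Ω, Is = 1e-16 A, Vt = 0.025 V are assumed to match the earlier numerical example). Note that the derivative with respect to v contains no λ, which is exactly the limitation flagged on this slide:

```python
import numpy as np

def f(v, lam, Vs=1.0, R=10.0, Is=1e-16, Vt=0.025):
    """Source-stepped residual: i_diode(v) + (v - lam*Vs)/R."""
    return Is * (np.exp(v / Vt) - 1.0) + (v - lam * Vs) / R

def dfdv(v, lam, R=10.0, Is=1e-16, Vt=0.025):
    """Jacobian (a scalar here); it is independent of lambda."""
    return (Is / Vt) * np.exp(v / Vt) + 1.0 / R
```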



Jacobian Altering Scheme

Continuation Schemes

Description

F(x(λ), λ) = λ F(x(λ)) + (1 - λ) x(λ)

Observations
  λ = 0:  F(x(0), 0) = x(0) = 0  and  ∂F(x(0), 0)/∂x = I
          Problem is easy to solve and the Jacobian is definitely nonsingular.
  λ = 1:  F(x(1), 1) = F(x(1))  and  ∂F(x(1), 1)/∂x = ∂F(x(1))/∂x
          Back to the original problem and the original Jacobian.
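In code, this continuation map and its Jacobian are one-liners (a sketch; F and J are the user's residual and Jacobian functions):

```python
import numpy as np

def F_hat(F, x, lam):
    """Jacobian-altering continuation map: lam*F(x) + (1 - lam)*x."""
    return lam * F(x) + (1.0 - lam) * x

def J_hat(J, x, lam):
    """Its Jacobian: lam*J_F(x) + (1 - lam)*I (identity at lam = 0, J_F at lam = 1)."""
    return lam * J(x) + (1.0 - lam) * np.eye(len(x))
```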

Continuation Schemes

Jacobian Altering Scheme

Basic Algorithm

Solve F(x(0), 0) = 0, set x(prev) = x(0)
Δλ = 0.01, λ = Δλ

While λ < 1 {
    x^0(λ) = x(prev) + ?
    Try to solve F(x(λ), λ) = 0 with Newton
    If Newton converged
        x(prev) = x(λ),  λ = λ + Δλ,  Δλ = 2Δλ
    Else
        Δλ = Δλ/2,  λ = λ_prev + Δλ
}


Jacobian Altering Scheme

Continuation Schemes

Initial Guess for each step.

x^0(λ + Δλ) = x(λ)

[Figure: solution curve x(λ) versus λ; the gap between x(λ) and x(λ + Δλ) is the initial guess error]


Jacobian Altering Scheme

Continuation Schemes

Update Improvement

F(x(λ+Δλ), λ+Δλ) ≈ F(x(λ), λ) + [∂F(x(λ), λ)/∂x] (x(λ+Δλ) - x(λ)) + [∂F(x(λ), λ)/∂λ] Δλ

Setting this expansion to zero and solving for the new point gives

  x^0(λ+Δλ) - x(λ) = -[∂F(x(λ), λ)/∂x]^{-1} [∂F(x(λ), λ)/∂λ] Δλ

∂F(x(λ), λ)/∂x:  have from last step's Newton
x^0(λ+Δλ):  better guess for next step's Newton

Continuation Schemes

Jacobian Altering Scheme

Update Improvement Cont.

If
  F(x(λ), λ) = λ F(x(λ)) + (1 - λ) x(λ)
Then
  ∂F(x, λ)/∂λ = F(x) - x(λ)

Easily computed
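Putting the last two slides together, the improved initial guess can be computed as below (a sketch; in practice one would reuse the Jacobian factorization from the last Newton solve rather than refactor as np.linalg.solve does here):

```python
import numpy as np

def predictor_guess(F, J, x, lam, dlam):
    """First-order guess x0(lam + dlam) = x - [dF_hat/dx]^{-1} (dF_hat/dlam) * dlam
    for the Jacobian-altering map, where dF_hat/dlam = F(x) - x."""
    dF_dlam = F(x) - x
    J_hat = lam * J(x) + (1.0 - lam) * np.eye(len(x))
    return x - np.linalg.solve(J_hat, dF_dlam) * dlam
```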

Jacobian Altering Scheme

Continuation Schemes

Update Improvement Cont. II.

x^0(λ + Δλ) = x(λ) - [∂F(x(λ), λ)/∂x]^{-1} [∂F(x(λ), λ)/∂λ] Δλ

Graphically

[Figure: solution curve x(λ); the tangent-predictor guess x^0(λ + Δλ) lies much closer to the curve than x(λ) does]


Continuation Schemes

Jacobian Altering Scheme

Still can have problems

[Figure: solution curve x(λ) that folds back on itself before reaching λ = 1. Plain lambda steps stall at the fold; arc-length steps follow the curve. At the first turning point one must switch from increasing to decreasing lambda, and later must switch back to increasing lambda.]

Continuation Schemes

Jacobian Altering Scheme

Arc-length Steps?

[Figure: solution curve x(λ) traversed in equal arc-length steps]

Must solve for lambda as well:

  F(x, λ) = 0
  ||x - x(prev)||_2^2 + (λ - λ_prev)^2 - Δarc^2 = 0
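A sketch of one Newton step on this augmented system, treating λ as an extra unknown (names are mine; J_x and J_lam return ∂F/∂x and ∂F/∂λ). It builds the bordered matrix shown on the next slide:

```python
import numpy as np

def arclength_newton_step(F, J_x, J_lam, x, lam, x_prev, lam_prev, darc):
    """One Newton step on F(x, lam) = 0 together with the arc-length constraint
    ||x - x_prev||^2 + (lam - lam_prev)^2 - darc^2 = 0."""
    n = len(x)
    g = np.concatenate([F(x, lam),
                        [np.dot(x - x_prev, x - x_prev)
                         + (lam - lam_prev) ** 2 - darc ** 2]])
    M = np.zeros((n + 1, n + 1))
    M[:n, :n] = J_x(x, lam)                 # dF/dx
    M[:n, n] = J_lam(x, lam)                # dF/dlam
    M[n, :n] = 2.0 * (x - x_prev)           # d(constraint)/dx
    M[n, n] = 2.0 * (lam - lam_prev)        # d(constraint)/dlam
    delta = np.linalg.solve(M, -g)
    return x + delta[:n], lam + delta[n]
```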


Jacobian Altering Scheme

Continuation Schemes

Arc-length steps by Newton

  [ ∂F(x^k, λ^k)/∂x       ∂F(x^k, λ^k)/∂λ  ] [ x^{k+1} - x^k ]       [ F(x^k, λ^k)                                            ]
  [ 2(x^k - x(prev))^T    2(λ^k - λ_prev)  ] [ λ^{k+1} - λ^k ]  = -  [ ||x^k - x(prev)||_2^2 + (λ^k - λ_prev)^2 - Δarc^2 ]

Jacobian Altering Scheme

Continuation Schemes

Arc-length Turning point

[Figure: solution curve x(λ) with a turning point; what happens here?]

At the turning point the upper left-hand block of the arc-length Newton matrix,

  [ ∂F(x^k, λ^k)/∂x       ∂F(x^k, λ^k)/∂λ  ]
  [ 2(x^k - x(prev))^T    2(λ^k - λ_prev)  ]

is singular, but the bordered matrix as a whole can remain nonsingular, so the arc-length Newton step is still well defined.

Summary

Damped Newton Schemes
  Globally Convergent if Jacobian is Nonsingular
  Difficulty with Singular Jacobians

Introduce Continuation Schemes
  Problem with Source/Load stepping
  More General Continuation Scheme

Improving Efficiency
  Better first guess for each continuation step
  Arc-length Continuation
