
Linear equations

VIVIANA MARCELA BAYONA CARDENAS

NON-CONVENTIONAL METHODS TO SOLVE L.E


HOW DO WE SOLVE AN EQUATION SYSTEM?

 Algebraic methods based on matrices are used to solve systems of linear equations; the same machinery is also used to solve some non-linear systems for which a solution is needed.
As a consequence of working with matrices, the solution methods below are the result of algebraic operations on matrices.
Solution Methods
 Direct Methods: Gaussian Elimination, Gauss with Pivoting, Gauss-Jordan, Special Systems
 Iterative Methods: Jacobi, Gauss-Seidel, Gauss-Seidel with relaxation
Examples of matrices (figures).
A way of writing such systems with matrices that is commonly found everywhere:

$$
\begin{bmatrix}
f_{11} & f_{12} & \cdots & f_{1n} \\
f_{21} & f_{22} & \cdots & f_{2n} \\
\vdots & \vdots &        & \vdots \\
f_{i1} & f_{i2} & \cdots & f_{in} \\
\vdots & \vdots &        & \vdots \\
f_{m1} & f_{m2} & \cdots & f_{mn}
\end{bmatrix}
\begin{bmatrix}
y_1 \\ y_2 \\ \vdots \\ y_i \\ \vdots \\ y_n
\end{bmatrix}
=
\begin{bmatrix}
c_1 \\ c_2 \\ \vdots \\ c_i \\ \vdots \\ c_m
\end{bmatrix}
\qquad F\,y = c
$$
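For example (with numbers chosen only for illustration), a 2 × 2 system in this form:

$$
\begin{bmatrix} 2 & 1 \\ 1 & 3 \end{bmatrix}
\begin{bmatrix} y_1 \\ y_2 \end{bmatrix}
=
\begin{bmatrix} 5 \\ 10 \end{bmatrix}
\qquad\text{i.e.}\qquad
\begin{aligned}
2y_1 + y_2 &= 5 \\
y_1 + 3y_2 &= 10
\end{aligned}
$$

whose solution is $y_1 = 1$, $y_2 = 3$.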
THOMAS METHOD – THOMAS ALGORITHM
 This method emerges as a simplification of LU factorization, but only when the matrix is tridiagonal.

$$
\begin{bmatrix}
b_1 & c_1 &        &         &         &         \\
a_2 & b_2 & c_2    &         &         &         \\
    & a_3 & b_3    & c_3     &         &         \\
    &     & \ddots & \ddots  & \ddots  &         \\
    &     &        & a_{n-1} & b_{n-1} & c_{n-1} \\
    &     &        &         & a_n     & b_n
\end{bmatrix}
\begin{bmatrix}
x_1 \\ x_2 \\ x_3 \\ \vdots \\ x_{n-1} \\ x_n
\end{bmatrix}
=
\begin{bmatrix}
r_1 \\ r_2 \\ r_3 \\ \vdots \\ r_{n-1} \\ r_n
\end{bmatrix}
\qquad A\,x = r
$$

Note that a simple way to recognize when to use this method is that the matrix is banded (tridiagonal). We solve the system in the same way as LU is used for other matrices.

WE CAN ALSO DERIVE THIS METHOD AS A SIMPLIFICATION OF GAUSSIAN ELIMINATION


As usual with LU, we write A = LU and, using the Doolittle convention $L_{ii} = 1$ for $i = 1, \ldots, n$, we finally have:

1  U11 U12  b1 c1 


L 1   U U   
 21   22 23 a b
 2 2 2 c 
 L32 1   U33 U34   a3 b3 c3 
    
           
 Ln1,n2 1   Un1,n1 Un1,n  an1 bn1 cn1
     
 Ln,n1 1  Un,n   an bn 

L U A

Note that the lower and upper matrices are simplified exactly as the LU method requires, and each of them reduces to just two diagonals: L keeps a unit main diagonal and one sub-diagonal, while U keeps its main diagonal and one super-diagonal. Hence the procedure for finding a solution is greatly simplified, especially for L.
Based on the matrix product shown above we obtain these expressions:

$$
U_{11} = b_1, \qquad
L_{n,n-1} = \frac{a_n}{U_{n-1,n-1}}, \qquad
U_{n-1,n} = c_{n-1}, \qquad
U_{n,n} = b_n - L_{n,n-1}\,U_{n-1,n}
$$

where $a_1 = 0$ and $c_n = 0$.

Now, sweeping from k = 2 to n, we finally have:

$$
L_{k,k-1} = \frac{a_k}{U_{k-1,k-1}}, \qquad
U_{k-1,k} = c_{k-1}, \qquad
U_{k,k} = b_k - L_{k,k-1}\,U_{k-1,k}
$$
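A minimal sketch of these recurrences in Python (assuming NumPy and 0-based arrays, with a[0] and c[-1] unused so that a_1 = 0 and c_n = 0; the function name is ours, not from the slides):

import numpy as np

def thomas_factor(a, b, c):
    # a, b, c: sub-, main and super-diagonals of the tridiagonal matrix A
    # (length n each; a[0] and c[-1] are ignored, matching a_1 = 0, c_n = 0).
    n = len(b)
    l = np.zeros(n)        # sub-diagonal of L
    u_diag = np.zeros(n)   # main diagonal of U
    u_sup = np.zeros(n)    # super-diagonal of U

    u_diag[0] = b[0]                            # U_11 = b_1
    for k in range(1, n):                       # k = 2 .. n in 1-based notation
        l[k] = a[k] / u_diag[k - 1]             # L_{k,k-1} = a_k / U_{k-1,k-1}
        u_sup[k - 1] = c[k - 1]                 # U_{k-1,k} = c_{k-1}
        u_diag[k] = b[k] - l[k] * u_sup[k - 1]  # U_{k,k} = b_k - L_{k,k-1} U_{k-1,k}
    return l, u_diag, u_sup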
If LUx=r and Ux=d then Ld=r, hence:

1   d1   r1 
L 1    r 
 21 d
  2   2 
 L32 1   d 3   r3 
   
        
 Ln 1,n  2 1  d n 1  rn 1 
     
 Ln ,n 1 1  d n   rn 

L d r

Using forward substitution:

$$
d_1 = r_1, \qquad d_k = r_k - L_{k,k-1}\,d_{k-1}, \quad k = 2, \ldots, n
$$

Finally we solve $Ux = d$ using back (regressive) substitution:
$$
\begin{bmatrix}
U_{11} & U_{12} &        &        &             &           \\
       & U_{22} & U_{23} &        &             &           \\
       &        & U_{33} & U_{34} &             &           \\
       &        &        & \ddots & \ddots      &           \\
       &        &        &        & U_{n-1,n-1} & U_{n-1,n} \\
       &        &        &        &             & U_{nn}
\end{bmatrix}
\begin{bmatrix}
x_1 \\ x_2 \\ x_3 \\ \vdots \\ x_{n-1} \\ x_n
\end{bmatrix}
=
\begin{bmatrix}
d_1 \\ d_2 \\ d_3 \\ \vdots \\ d_{n-1} \\ d_n
\end{bmatrix}
\qquad U\,x = d
$$

where

$$
x_n = \frac{d_n}{U_{n,n}}, \qquad
x_k = \frac{d_k - \sum_{j=k+1}^{n} U_{kj}\,x_j}{U_{k,k}}, \quad k = n-1, \ldots, 1
$$
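Putting the three stages together, a minimal sketch of the complete Thomas solve (0-based NumPy arrays; the function name and the small test system are only illustrative):

import numpy as np

def thomas_solve(a, b, c, r):
    # Solve the tridiagonal system A x = r.
    # a, b, c: sub-, main and super-diagonals of A (a[0] and c[-1] ignored); r: right-hand side.
    n = len(b)
    l = np.zeros(n); u_diag = np.zeros(n); u_sup = np.zeros(n)

    # 1) Factorization A = LU (same recurrences as above)
    u_diag[0] = b[0]
    for k in range(1, n):
        l[k] = a[k] / u_diag[k - 1]
        u_sup[k - 1] = c[k - 1]
        u_diag[k] = b[k] - l[k] * u_sup[k - 1]

    # 2) Forward substitution: L d = r
    d = np.zeros(n)
    d[0] = r[0]
    for k in range(1, n):
        d[k] = r[k] - l[k] * d[k - 1]

    # 3) Back substitution: U x = d (U is bidiagonal, so the sum reduces to one term)
    x = np.zeros(n)
    x[-1] = d[-1] / u_diag[-1]
    for k in range(n - 2, -1, -1):
        x[k] = (d[k] - u_sup[k] * x[k + 1]) / u_diag[k]
    return x

# Illustrative use (values invented for the example):
a = np.array([0.0, -1.0, -1.0, -1.0])
b = np.array([2.0, 2.0, 2.0, 2.0])
c = np.array([-1.0, -1.0, -1.0, 0.0])
r = np.array([1.0, 0.0, 0.0, 1.0])
print(thomas_solve(a, b, c, r))   # -> [1. 1. 1. 1.]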
CHOLESKY DECOMPOSITION
It is a decomposition of a symmetric, positive-definite matrix into the product of a lower triangular matrix and its conjugate transpose. When it is applicable, this method is roughly twice as efficient as LU decomposition for solving systems.
$$
U = L^{T}, \qquad Ax = b
$$

HENCE

$$
L\,L^{T} x = b
$$

What was mentioned before shows that:

$$
\begin{bmatrix}
L_{11}  &         &        &             \\
L_{21}  & L_{22}  &        &             \\
\vdots  & \vdots  & \ddots &             \\
L_{n,1} & L_{n,2} & \cdots & L_{n,n}
\end{bmatrix}
\begin{bmatrix}
L_{11} & L_{21} & \cdots & L_{n,1} \\
       & L_{22} & \cdots & L_{n,2} \\
       &        & \ddots & \vdots  \\
       &        &        & L_{n,n}
\end{bmatrix}
=
\begin{bmatrix}
a_{11}  & a_{12}  & \cdots & a_{1,n} \\
a_{21}  & a_{22}  & \cdots & a_{2,n} \\
\vdots  & \vdots  & \ddots & \vdots  \\
a_{n,1} & a_{n,2} & \cdots & a_{n,n}
\end{bmatrix}
\qquad L\,L^{T} = A
$$
From the product of the nth row of L and the nth column of L^T we obtain that:

$$
L_{n,1}^2 + L_{n,2}^2 + \cdots + L_{n,n-2}^2 + L_{n,n-1}^2 + L_{nn}^2 = a_{nn}
$$

$$
L_{nn}^2 = a_{nn} - \left( L_{n,1}^2 + L_{n,2}^2 + \cdots + L_{n,n-2}^2 + L_{n,n-1}^2 \right)
= a_{nn} - \sum_{j=1}^{n-1} L_{n,j}^2
$$

$$
L_{nn} = \sqrt{\,a_{nn} - \sum_{j=1}^{n-1} L_{n,j}^2\,}
$$
Once again, sweeping from k = 1 to n, we obtain:

$$
L_{kk} = \sqrt{\,a_{kk} - \sum_{j=1}^{k-1} L_{k,j}^2\,}
$$
Similarly, if we multiply the nth row of L by the (n-1)th column of L^T we will have:

$$
L_{n,1} L_{n-1,1} + L_{n,2} L_{n-1,2} + \cdots + L_{n,n-2} L_{n-1,n-2} + L_{n,n-1} L_{n-1,n-1} = a_{n,n-1}
$$

$$
L_{n,n-1} = \frac{a_{n,n-1} - \left( L_{n,1} L_{n-1,1} + L_{n,2} L_{n-1,2} + \cdots + L_{n,n-2} L_{n-1,n-2} \right)}{L_{n-1,n-1}}
= \frac{a_{n,n-1} - \sum_{j=1}^{n-2} L_{n,j} L_{n-1,j}}{L_{n-1,n-1}}
$$
Sweeping from k = 1 to n we obtain:

$$
L_{k,i} = \frac{a_{k,i} - \sum_{j=1}^{i-1} L_{k,j}\,L_{i,j}}{L_{i,i}},
\qquad \text{where } 1 \le i \le k-1
$$
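A minimal sketch of these two formulas in Python (assuming NumPy and 0-based indexing; the function name and the small test matrix are ours, not from the slides):

import numpy as np

def cholesky_lower(A):
    # Return the lower-triangular L with A = L @ L.T, for A symmetric positive definite.
    n = A.shape[0]
    L = np.zeros_like(A, dtype=float)
    for k in range(n):
        # Off-diagonal entries L[k, i], i < k  (formula for L_{k,i} above)
        for i in range(k):
            L[k, i] = (A[k, i] - np.dot(L[k, :i], L[i, :i])) / L[i, i]
        # Diagonal entry L[k, k]  (formula for L_{kk} above)
        L[k, k] = np.sqrt(A[k, k] - np.dot(L[k, :k], L[k, :k]))
    return L

A = np.array([[4.0, 2.0],
              [2.0, 3.0]])
L = cholesky_lower(A)
print(np.allclose(L @ L.T, A))   # -> True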
APPLICATIONS
 Linear least squares: Systems of the form Ax = b with A symmetric and positive definite arise quite often in applications. For instance, the normal equations in linear least squares problems are of this form (a small sketch follows after this list).

 Monte Carlo Simulation: The Cholesky decomposition is commonly used in the Monte Carlo method for simulating systems with multiple correlated variables: the matrix of inter-variable correlations is decomposed to give the lower-triangular L.

 Non-linear optimization: Non-linear multi-variate functions may be minimized over their parameters using variants of Newton's method called quasi-Newton methods.
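As a small sketch of the least-squares case (the data values are invented purely for illustration): the normal equations A^T A x = A^T b have a symmetric positive-definite coefficient matrix, so they can be solved through its Cholesky factor.

import numpy as np

# Made-up overdetermined problem: fit y = m*t + q to four data points.
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 1.9, 3.2, 3.9])
A = np.column_stack([t, np.ones_like(t)])   # design matrix

# Normal equations: (A^T A) x = A^T y, with A^T A symmetric positive definite.
M = A.T @ A
rhs = A.T @ y

# Solve via the Cholesky factor M = L L^T: first L z = rhs, then L^T x = z.
L = np.linalg.cholesky(M)
z = np.linalg.solve(L, rhs)     # forward substitution
x = np.linalg.solve(L.T, z)     # back substitution
print(x)                        # fitted slope m and intercept q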
BIBLIOGRAPHY
 CHAPRA, Steven C. and CANALE, Raymond P.: Métodos Numéricos para Ingenieros. McGraw-Hill, 2002.
 http://en.wikipedia.org/wiki/Cholesky_decomposition#Applications
 http://math.fullerton.edu/mathews/n2003/CholeskyMod.html
