Introduction to Non-Linear Optimization
Optimization Tree
What is Optimization?
What is Optimization? (Cont.)
Figure 6: Global versus local optimization.
Figure 7: A local optimum coincides with the global optimum if the function is convex.
Mathematical Background
The slope, or gradient, of the objective function f represents the
direction in which the function increases or decreases most rapidly:

df/dx = lim_{Δx → 0} [f(x + Δx) − f(x)] / Δx = lim_{Δx → 0} Δf / Δx
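The limit definition above can be approximated numerically by a finite difference with a small step. A minimal sketch, using an assumed test function f(x) = x² − 4x + 5 (not from the slides):

```python
def derivative(f, x, dx=1e-6):
    """Forward-difference approximation of df/dx, following the limit definition."""
    return (f(x + dx) - f(x)) / dx

# Assumed example function: f(x) = x^2 - 4x + 5, so df/dx = 2x - 4
f = lambda x: x**2 - 4*x + 5
slope = derivative(f, 3.0)   # exact slope at x = 3 is 2
```

The smaller the step dx, the closer the quotient gets to the true derivative, which is exactly the limit the equation expresses.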
Optimization Algorithm
Optimization Methods
Deterministic
• Direct Search – uses only objective function values to locate the minimum.
• Gradient Based – uses first- or second-order derivatives of the objective function.
• A maximization problem is handled by minimizing the negated objective, −f(x).
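The minimize/maximize duality in the last bullet can be shown in a few lines. A sketch with an assumed concave function f(x) = −(x − 2)² + 5 and a crude grid search standing in for a real minimizer:

```python
# Maximize f(x) = -(x - 2)^2 + 5 by minimizing -f(x) instead.
f = lambda x: -(x - 2.0) ** 2 + 5.0
neg_f = lambda x: -f(x)                    # any minimizer applied to this maximizes f

# crude grid search on [0, 4] as a stand-in for a proper minimization routine
xs = [i / 100.0 for i in range(0, 401)]
x_best = min(xs, key=neg_f)                # x that minimizes -f, i.e. maximizes f
```

The maximizer of f and the minimizer of −f are the same point, so one minimization code base covers both problem types.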
Single Variable
• Newton–Raphson – gradient-based technique (uses first-order conditions, FOC).
• Golden Section Search – iterative method that reduces the search interval each step.
Multivariable Techniques (these build on single-variable techniques,
especially Golden Section, for the line searches)
• Unconstrained Optimization
a.) Powell's Method – non-gradient based; models the objective with a
quadratic (degree-2) polynomial.
b.) Gradient Based – Steepest Descent (FOC) or Least Mean Squares
(LMS).
c.) Hessian Based – Conjugate Gradient (FOC) and BFGS (SOC).
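Steepest Descent, the simplest of the gradient-based methods listed above, repeatedly steps opposite the gradient. A minimal sketch with a fixed step size and an assumed quadratic test function (the slides do not specify one):

```python
import math

def steepest_descent(grad, x, lr=0.1, tol=1e-8, max_iter=10_000):
    """Steepest descent: step opposite the gradient (a first-order method)."""
    for _ in range(max_iter):
        g = grad(x)
        if math.sqrt(sum(gi * gi for gi in g)) < tol:   # stop when gradient ~ 0
            break
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# Assumed example: f(x, y) = (x - 1)^2 + 2*(y + 3)^2, grad = [2(x-1), 4(y+3)]
grad = lambda x: [2 * (x[0] - 1.0), 4 * (x[1] + 3.0)]
x_star = steepest_descent(grad, [0.0, 0.0])   # approaches the minimum (1, -3)
```

In practice the fixed step size lr is replaced by a line search, typically the Golden Section method mentioned above, along the descent direction.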
© 2008 Solutions 4U Sdn Bhd. All Rights Reserved
Fundamentals of Optimization
• Constrained Optimization
a.) Indirect approach – transforms the constrained problem into an
unconstrained one.
b.) Exterior Penalty Function (EPF) and Augmented Lagrange
Multiplier (ALM) methods.
c.) Direct methods – Sequential Linear Programming (SLP), SQP, and
the Generalized Reduced Gradient (GRG) method.
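The Exterior Penalty Function idea can be sketched concretely: constraint violations are added to the objective with a weight r, and the resulting unconstrained problem is re-solved as r grows. A minimal sketch under assumed details (quadratic penalty, finite-difference gradient descent as the inner solver, and a made-up one-dimensional problem):

```python
def penalty_min(f, g, x0, outer=5, inner=2000):
    """Exterior penalty sketch: minimize P(x) = f(x) + r * max(0, g(x))**2
    by gradient descent, increasing the penalty weight r each outer pass."""
    x, r = x0, 1.0
    for _ in range(outer):
        P = lambda z, r=r: f(z) + r * max(0.0, g(z)) ** 2
        lr = 0.4 / (1.0 + r)            # shrink the step as r grows, for stability
        for _ in range(inner):
            h = 1e-6                    # central-difference gradient of P
            x -= lr * (P(x + h) - P(x - h)) / (2 * h)
        r *= 10.0                       # tighten the penalty
    return x

# Assumed example: minimize (x - 3)^2 subject to x <= 1, i.e. g(x) = x - 1 <= 0
f = lambda x: (x - 3.0) ** 2
g = lambda x: x - 1.0
x_star = penalty_min(f, g, 0.0)         # approaches the constrained optimum x = 1
```

Because the penalty is only active outside the feasible region, iterates approach the constraint boundary from the infeasible side, which is exactly why the method is called "exterior".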