
Optimization Techniques

Topic: Karush-Kuhn-Tucker Optimality Criteria

Dr. Nasir M Mirza


Email: nasirmm@yahoo.com

Optimality Criteria
Important question: How do we know that we have found the optimum for f(x)?

Answer: Test the solution for the necessary and sufficient conditions.

Necessary Condition for Optimality:


The first-order optimality condition for the minimum of f(x) can be derived by considering a linear expansion of the function around the optimum point x* using the Taylor series:

\[
f(x) \approx f(x^*) + \nabla f(x^*)^{T} (x - x^*)
\]
\[
f(x) - f(x^*) \approx \nabla f(x^*)^{T} (x - x^*)
\]

where ∇f(x*) is the gradient of the function f(x) at x* and x − x* is the displacement from the optimum.
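Why the gradient must vanish follows directly from this expansion; a short argument (added here for completeness, not spelled out on the slide):

\[
\text{If } \nabla f(x^*) \neq 0, \text{ take } x = x^* - \alpha\,\nabla f(x^*),\ \alpha > 0 \text{ small}
\;\Longrightarrow\;
f(x) - f(x^*) \approx -\alpha\,\lVert \nabla f(x^*) \rVert^{2} < 0,
\]

which contradicts x* being a minimum. Hence ∇f(x*) = 0 is necessary.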

Conditions for Optimality


Unconstrained Problems:

If x* is a minimum point, then this condition can only be ensured if ∇f(x*) = 0; the gradient of f(x) must vanish at the optimum.
Thus the first-order necessary condition for the minimum of a function is that its gradient is zero at the optimum.

This condition also holds at a maximum point and at any other point where the slope is zero. Therefore, it is only a necessary condition and not a sufficient condition.

[Figure: graph of f(x) = x(-cos(1) sin(1) + sin(x)) showing a local maximum, two local minima, and an inflection point, all with zero slope.]

Conditions for Optimality


Graph of the function and its derivative:

[Figure: plots of f(x) and df(x)/dx; the derivative is zero at the local maximum, the local minima, and the inflection point.]

Sufficient Condition for Optimality:


Unconstrained Problems
1. ∇f(x*) = 0; the gradient of f(x) must vanish at the optimum.
2. The second-order condition for the minimum of f(x) can be derived by considering the quadratic expansion of the function around the optimum point x* using the Taylor series:

\[
f(x) \approx f(x^*) + \nabla f(x^*)^{T} d + \tfrac{1}{2}\, d^{T} \nabla^{2} f(x^*)\, d + \cdots
\]

where ∇²f(x*) is the Hessian matrix of the function f(x) and d = x − x*.

For x* to be a local minimum, f(x) − f(x*) must be greater than or equal to zero in the neighborhood of x*. Since ∇f(x*) = 0, we must have

\[
\tfrac{1}{2}\, d^{T} \nabla^{2} f(x^*)\, d \ge 0 .
\]

Sufficient Condition for Optimality:


Unconstrained Problems
1. ∇f(x*) = 0; the gradient of f(x) must vanish at the optimum.
2. The Hessian matrix must be positive definite at the minimum:


\[
H = \nabla^{2} f(x) =
\begin{bmatrix}
\dfrac{\partial^{2} f}{\partial x_{1}^{2}} & \dfrac{\partial^{2} f}{\partial x_{1}\,\partial x_{2}} & \cdots & \dfrac{\partial^{2} f}{\partial x_{1}\,\partial x_{n}} \\[2mm]
\dfrac{\partial^{2} f}{\partial x_{2}\,\partial x_{1}} & \dfrac{\partial^{2} f}{\partial x_{2}^{2}} & \cdots & \dfrac{\partial^{2} f}{\partial x_{2}\,\partial x_{n}} \\[2mm]
\vdots & \vdots & \ddots & \vdots \\[2mm]
\dfrac{\partial^{2} f}{\partial x_{n}\,\partial x_{1}} & \dfrac{\partial^{2} f}{\partial x_{n}\,\partial x_{2}} & \cdots & \dfrac{\partial^{2} f}{\partial x_{n}^{2}}
\end{bmatrix}
\]
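As an illustration (not part of the original slides), positive definiteness of a numerical Hessian can be checked in MATLAB by inspecting its eigenvalues; the matrix below is the same Hessian that appears later in Example 1.

% Check positive definiteness of a symmetric Hessian via its eigenvalues.
H = [2 -1; -1 4];            % example Hessian; the same matrix appears in Example 1
lambda = eig(H)              % both eigenvalues are positive here
isPosDef = all(lambda > 0)   % logical 1 (true) => H is positive definite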

Conditions for Optimality


Unconstrained Problems

A positive definite Hessian at the minimum ensures only that a local minimum has been found.
The minimum is the global minimum only if it can be shown that the Hessian is positive definite for all possible values of x. This would imply a convex design space.
Very hard to prove in practice!

Optimality Conditions: Unconstrained Case

Let x* be the point that we think is the minimum for f(x).
Necessary condition (for optimality): ∇f(x*) = 0.
A point that satisfies the necessary condition is a stationary point; it can be a minimum, a maximum, or a saddle point.
How do we know that we have a minimum?
Answer: the sufficiency conditions. The sufficient conditions for x* to be a strict local minimum are:
∇f(x*) = 0,
∇²f(x*) is positive definite.

Example 1:
Find all stationary points of the following function. Using the optimality conditions, classify them as minimum, maximum, or inflection points.
The objective function is: f(x, y) = −2x + x² − xy + 2y².
The gradient vector:

\[
\nabla f =
\begin{bmatrix} \partial f / \partial x \\ \partial f / \partial y \end{bmatrix}
=
\begin{bmatrix} -2 + 2x - y \\ -x + 4y \end{bmatrix}
\]

The Hessian matrix:

\[
H =
\begin{bmatrix} \partial^{2} f / \partial x^{2} & \partial^{2} f / \partial x\,\partial y \\ \partial^{2} f / \partial y\,\partial x & \partial^{2} f / \partial y^{2} \end{bmatrix}
=
\begin{bmatrix} 2 & -1 \\ -1 & 4 \end{bmatrix}
\]

Example 1:
The first-order (necessary) optimality conditions:
−2 + 2x − y = 0,
−x + 4y = 0, so x = 4y, and then
−2 + 2(4y) − y = 0; 7y = 2; y = 2/7 and x = 8/7.

The candidate solution point is x = 1.14286 and y = 0.285714.

Then let us apply the second-order optimality condition: the Hessian matrix must be positive definite at the minimum. The leading principal minors are:
A1 = |a11| = 2; A2 = det H = 8 − 1 = 7; both are positive,
so H is positive definite.
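A minimal MATLAB check of this result (a sketch; it simply re-solves the linear system obtained from the gradient and inspects the Hessian eigenvalues):

% Stationary point of f(x,y) = -2x + x^2 - x*y + 2*y^2:
% setting the gradient to zero gives  2x - y = 2  and  -x + 4y = 0.
A = [2 -1; -1 4];     % coefficient matrix (also the constant Hessian of f)
b = [2; 0];
z = A\b               % z = [8/7; 2/7], i.e. approximately [1.14286; 0.285714]
eig(A)                % both eigenvalues positive => positive definite Hessian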

Example 1:
The value of f(x, y) at the point x* = 1.14286, y* = 0.285714 is f = −1.14286.
The point is a minimum point.
Since the Hessian matrix is constant and positive definite, the function is convex; therefore any local minimum is also a global minimum.

% MATLAB program to draw contours of the objective function
[X,Y] = meshgrid(-1:.1:2);              % grid over -1 <= x, y <= 2
Z = -2.*X + X.*X - X.*Y + 2.*Y.*Y;      % f(x,y) = -2x + x^2 - xy + 2y^2
contour(X,Y,Z,100)                      % 100 contour levels
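To also mark the minimum found above on the same contour plot, one can append the following (a small optional addition, not in the original listing):

hold on
plot(8/7, 2/7, 'r*', 'MarkerSize', 10)   % the stationary point (x*, y*)
hold off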

Example 1:
Contour graph using MATLAB: graphical presentation of the function and of the minimum at the point (x*, y*).

Example 1:
3-D plot: it also confirms that a global minimum exists at the point (x*, y*).

% MATLAB program to draw the function surface
[X,Y] = meshgrid(-1:.1:2);              % same grid as before
Z = -2.*X + X.*X - X.*Y + 2.*Y.*Y;      % f(x,y) = -2x + x^2 - xy + 2y^2
mesh(X,Y,Z);                            % surface plot over the actual x, y coordinates

Example 1:
Important observations:
The minimum point does not change if we add a constant to the objective function.
The minimum point does not change if we multiply the objective function by a positive constant.
The problem changes from a minimization to a maximization problem if we multiply the objective function by a negative sign.
The unconstrained problem is a convex problem if the objective function is convex. For convex problems any local minimum is also a global minimum.
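These observations follow directly from how the gradient transforms under scaling and shifting; a one-line justification (not spelled out on the slide):

\[
\nabla\bigl(a\,f(x) + c\bigr) = a\,\nabla f(x), \qquad a \neq 0,
\]

so for a > 0 the stationary points (and minima) of a f(x) + c coincide with those of f(x), while for a < 0 every minimum of f becomes a maximum.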

What is an optimization problem:


Find values of the variables that minimize or maximize the objective function while satisfying the constraints.
The standard form of the constrained optimization problem can be written as:

Minimize: F(x)  (objective function)
Subject to: gj(x) ≥ 0,  j = 1, . . . , m  (inequality constraints)
hk(x) = 0,  k = 1, . . . , l  (equality constraints)
xi,lower ≤ xi ≤ xi,upper,  i = 1, . . . , n  (side constraints)

where x = (x1, x2, x3, x4, x5, . . . , xn) are the design variables.

Conditions for Optimality


Unconstrained Problems
1. ∇f(x*) = 0; the gradient of f(x) must vanish at the optimum.
2. The Hessian matrix must be positive definite (i.e. all eigenvalues positive at the optimum point):

\[
H = \nabla^{2} f(x) =
\begin{bmatrix}
\dfrac{\partial^{2} f}{\partial x_{1}^{2}} & \dfrac{\partial^{2} f}{\partial x_{1}\,\partial x_{2}} & \cdots & \dfrac{\partial^{2} f}{\partial x_{1}\,\partial x_{n}} \\[2mm]
\dfrac{\partial^{2} f}{\partial x_{2}\,\partial x_{1}} & \dfrac{\partial^{2} f}{\partial x_{2}^{2}} & \cdots & \dfrac{\partial^{2} f}{\partial x_{2}\,\partial x_{n}} \\[2mm]
\vdots & \vdots & \ddots & \vdots \\[2mm]
\dfrac{\partial^{2} f}{\partial x_{n}\,\partial x_{1}} & \dfrac{\partial^{2} f}{\partial x_{n}\,\partial x_{2}} & \cdots & \dfrac{\partial^{2} f}{\partial x_{n}^{2}}
\end{bmatrix}
\]

Conditions for Optimality


Unconstrained Problems

A positive definite Hessian at the minimum ensures only that a local minimum has been found.
The minimum is the global minimum only if it can be shown that the Hessian is positive definite for all possible values of x. This would imply a convex design space.
Very hard to prove in practice!

Optimality Conditions: Unconstrained Case

Let x* be the point that we think is the minimum for f(x).
Necessary condition (for optimality): ∇f(x*) = 0.
A point that satisfies the necessary condition is a stationary point; it can be a minimum, a maximum, or a saddle point.
How do we know that we have a minimum?
Answer: the sufficiency conditions. The sufficient conditions for x* to be a strict local minimum are:
∇f(x*) = 0,
∇²f(x*) is positive definite.

Constrained Case: KKT Conditions

To prove a claim of optimality in constrained minimization (or maximization), we have to check the candidate point against the Karush-Kuhn-Tucker (KKT) conditions.
Kuhn and Tucker extended the Lagrangian theory to include the general classical single-objective nonlinear programming problem:
Minimize: f(x)
Subject to: gj(x) ≥ 0,  for j = 1, 2, ..., J
hk(x) = 0,  for k = 1, 2, ..., K
x = (x1, x2, ..., xN)

Interior versus Exterior Solutions


Interior:
If no constraints are active and (thus) the solution lies in the interior of the feasible space, then the necessary condition for optimality is the same as for the unconstrained case: ∇f(x*) = 0.
Exterior (boundary):
If the solution lies on the boundary of the feasible region, then the condition ∇f(x*) = 0 does not apply, because some constraints will block movement to this minimum.
Some constraints will (thus) be active.
We cannot get any more improvement (in this case) if for x* there does not exist a vector d that is both a descent direction and a feasible direction.
In other words: the possible feasible directions do not intersect the possible descent directions at all.
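In symbols (using the gj(x) ≥ 0 convention adopted on the following slides; this restatement is not on the original slide):

\[
\text{descent direction: } \nabla f(x^*)^{T} d < 0,
\qquad
\text{feasible direction: } \nabla g_j(x^*)^{T} d \ge 0 \ \text{ for every active } g_j ,
\]

and x* is a candidate optimum when no direction d satisfies both conditions at once.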

Necessary KKT Conditions


For the problem: Minimize the objective function f(x)
subject to: gj(x) ≥ 0,  j = 1, 2, 3, . . . , J
hk(x) = 0,  k = 1, 2, 3, . . . , K
xi(L) ≤ xi ≤ xi(U),  i = 1, 2, 3, . . . , N

This is the most general form of a single-objective constrained optimization problem.
Here gj(x) are the inequality constraint functions (J in total) and hk(x) are the equality constraint functions (K in total).
A point is feasible if all constraints and bounds are satisfied.

Necessary KKT Conditions


For the problem:
Minimize: f(x)
subject to: gj(x) ≥ 0,  j = 1, 2, 3, . . . , J
hk(x) = 0,  k = 1, 2, 3, . . . , K
xi(L) ≤ xi ≤ xi(U),  i = 1, 2, 3, . . . , N

the necessary conditions are:

\[
\nabla f(x) - \sum_{j=1}^{J} u_j \nabla g_j(x) - \sum_{k=1}^{K} v_k \nabla h_k(x) = 0 \qquad \text{(optimality)}
\]

gj(x) ≥ 0  for j = 1, 2, ..., J  (feasibility)
hk(x) = 0  for k = 1, 2, ..., K  (feasibility)
uj gj(x) = 0  for j = 1, 2, ..., J  (complementary slackness)
uj ≥ 0  for j = 1, 2, ..., J  (non-negativity)

Necessary KKT Conditions (if gj(x) ≤ 0)

If the definition of feasibility changes, the optimality and feasibility conditions change accordingly. The necessary conditions become:

\[
\nabla f(x) + \sum_{j=1}^{J} u_j \nabla g_j(x) + \sum_{k=1}^{K} v_k \nabla h_k(x) = 0 \qquad \text{(optimality)}
\]

gj(x) ≤ 0  for j = 1, 2, ..., J  (feasibility)
hk(x) = 0  for k = 1, 2, ..., K  (feasibility)
uj gj(x) = 0  for j = 1, 2, ..., J  (complementary slackness)
uj ≥ 0  for j = 1, 2, ..., J  (non-negativity)

Exercise 4.1.1
Let us take the following function to be minimized:
f(x, y) = (x² + y − 11)² + (x + y² − 7)²
subject to:
g1(x, y) = 26 − (x − 5)² − y² ≥ 0,
g2(x, y) = 20 − 4x − y ≥ 0.
Here, not every point in the search space is feasible. The feasible points are those that satisfy the above two constraints and the variable bounds.
Let us also choose four points x(1) = (1, 5)T, x(2) = (0, 0)T, x(3) = (3, 2)T and x(4) = (3.396, 0)T to investigate whether each point is a K-T point.
The feasible search space and these four points are shown on a contour plot of the objective function in Figure 4.1.
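A plot in the spirit of Figure 4.1 can be reproduced with a short MATLAB sketch (the plotting range is an assumption; the problem data are taken from above):

% Objective contours with the two constraint boundaries g1 = 0 and g2 = 0.
[X,Y] = meshgrid(0:.05:6, 0:.05:6);
F = (X.^2 + Y - 11).^2 + (X + Y.^2 - 7).^2;
contour(X, Y, F, 60); hold on
contour(X, Y, 26 - (X-5).^2 - Y.^2, [0 0], 'k', 'LineWidth', 2)   % g1 = 0
contour(X, Y, 20 - 4*X - Y,         [0 0], 'k', 'LineWidth', 2)   % g2 = 0
plot([1 0 3 3.396], [5 0 2 0], 'ro')   % the four candidate points
hold off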

Exercise 4.1.1
The region on the other side of the hatched portion of a constraint line is feasible.
The combination of the two constraints and the variable bounds makes the interior region feasible, as depicted in the figure.

Exercise 4.1.1
At first, we transform the variable bounds into two inequality constraints: g3(x, y) = x ≥ 0 and g4(x, y) = y ≥ 0.
Thus, the above problem has four inequality constraints (J = 4) and no equality constraint (K = 0).
There are two problem variables: N = 2.
Thus, for each point a total of 2 + 3 × 4 + 0 = 14 Kuhn-Tucker conditions need to be checked.
To formulate all K-T conditions, we first calculate the gradient of the objective function.
In Table 4.1, we compute these gradients numerically and also compute the constraint values at all four points.
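The entries of Table 4.1 can be reproduced numerically; a small MATLAB sketch (the gradient expressions used here are the ones derived on the next slides):

% Gradient of f and constraint values at the four candidate points.
pts = [1 5; 0 0; 3 2; 3.396 0];
for i = 1:size(pts,1)
    x = pts(i,1);  y = pts(i,2);
    gradf = [4*x^3 + 4*x*y - 42*x + 2*y^2 - 14;
             2*x^2 + 4*y^3 + 4*x*y - 26*y - 22];
    g = [26 - (x-5)^2 - y^2;  20 - 4*x - y;  x;  y];
    fprintf('(%6.3f,%6.3f): grad f = (%8.2f,%8.2f), g = (%7.3f %7.3f %7.3f %7.3f)\n', ...
            x, y, gradf, g);
end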

Exercise 4.1.1

For point (1, 5)T:

g1(x, y) = 26 − (x − 5)² − y² = 26 − (1 − 5)² − 5² = 26 − 16 − 25 = −15.0,
g2(x, y) = 20 − 4x − y = 20 − 4(1) − 5 = 11.0,
g3(x, y) = x = 1.0,
g4(x, y) = y = 5.0.

Now we have f(x, y) = (x² + y − 11)² + (x + y² − 7)², so
fx = 4x³ + 4xy − 42x + 2y² − 14,
fy = 2x² + 4y³ + 4xy − 26y − 22,
∇f(x, y) = (fx, fy)T = (4x³ + 4xy − 42x + 2y² − 14, 2x² + 4y³ + 4xy − 26y − 22)T = (18, 370)T.

Exercise 4.1.1

For point (1, 5)T, the constraint gradients are:

g1(x, y) = 26 − (x − 5)² − y²:  (g1)x = −2(x − 5), (g1)y = −2y,
∇g1(x, y) = ((g1)x, (g1)y)T = (−2x + 10, −2y)T = (8, −10)T.

g2 = 20 − 4x − y:  (g2)x = −4, (g2)y = −1, so ∇g2(x, y) = (−4, −1)T.

g3 = x:  (g3)x = 1, (g3)y = 0, so ∇g3(x, y) = (1, 0)T.

g4 = y:  (g4)x = 0, (g4)y = 1, so ∇g4(x, y) = (0, 1)T.

Exercise 4.1.1

For the first point, when we substitute the values into the optimality condition

\[
\nabla f(x) - \sum_{j=1}^{J} u_j \nabla g_j(x) - \sum_{k=1}^{K} v_k \nabla h_k(x) = 0 ,
\]

we get, component by component,

fx − u1(g1)x − u2(g2)x − u3(g3)x − u4(g4)x = 0:  18 − (8)u1 − (−4)u2 − (1)u3 − (0)u4 = 0;
fy − u1(g1)y − u2(g2)y − u3(g3)y − u4(g4)y = 0:  370 − (−10)u1 − (−1)u2 − (0)u3 − (1)u4 = 0;

using ∇g1 = (8, −10)T, ∇g2 = (−4, −1)T, ∇g3 = (1, 0)T and ∇g4 = (0, 1)T. There is no equality constraint, so the vk ∇hk terms drop out.

Exercise 4.1.1

For the first point, when we substitute the values, we get the following set of K-T conditions:

18 − 8u1 + 4u2 − u3 = 0  (optimality, x-component),
370 + 10u1 + u2 − u4 = 0  (optimality, y-component),
g1 = −15, g2 = 11, g3 = 1, g4 = 5  (feasibility requires gj(x) ≥ 0 for j = 1, 2, ..., J),
(−15)u1 = 0; (11)u2 = 0; (1)u3 = 0; (5)u4 = 0  (complementary slackness, uj gj(x) = 0),
u1, u2, u3, u4 ≥ 0  (non-negativity).

Here g1 = −15 is not greater than or equal to zero, while g2 = +11 > 0, g3 = 1 > 0 and g4 = 5 > 0. There is no equality constraint, so the hk(x) = 0 conditions are empty.

Exercise 4.1.1
It is clear that the feasibility condition gj(x) ≥ 0 is not satisfied.
This is enough to conclude that the point x(1) is not a K-T point.
In fact, since the first constraint value is negative, this constraint is violated at this point and the point x(1) is not a feasible point, as shown in Figure 4.1.
If a point is infeasible, the point cannot be an optimal point.

Exercise 4.1.1

[Figure 4.1: contour plot of the objective function with the feasible region and the four candidate points.]

Exercise 4.1.1
For the second point x(2) = (0, 0)T we obtain the following conditions:
−14 − 10u1 + 4u2 − u3 = 0,
−22 + u2 − u4 = 0,
g1 = 1 > 0; g2 = 20 > 0; g3 = 0 = 0; g4 = 0 = 0,
(1)u1 = 0; (20)u2 = 0; (0)u3 = 0; (0)u4 = 0.

All but the final set of conditions reveal that u1 = 0, u2 = 0, u3 = −14 and u4 = −22.

Exercise 4.1.1
Since u1 = 0, u2 = 0, u3 = −14 and u4 = −22, the final set of conditions (non-negativity) is not satisfied, because u3 and u4 are negative.
Thus, the point x(2) is not a K-T point.
Since no constraint is violated, the point is a feasible point, as shown in Figure 4.1.
Nevertheless, the point x(2) cannot be an optimal point.

Exercise 4.1.1
Similarly, the conditions for the third point x(3) = (3, 2)T are the following:
−4u1 + 4u2 − u3 = 0,
4u1 + u2 − u4 = 0,
g1 = 18 > 0; g2 = 6 > 0; g3 = 3 > 0; g4 = 2 > 0,
(18)u1 = 0, (6)u2 = 0, (3)u3 = 0, (2)u4 = 0.
The vector u* = (0, 0, 0, 0)T satisfies all of the above conditions.

Exercise 4.1.1
The vector u* = (0, 0, 0, 0)T satisfies all the above conditions.
Thus, the point x(3) is a K-T point (Figure 4.1).
As mentioned earlier, K-T points are likely candidates for minimum points.
To conclude, we may say that confirming the optimality of a point requires the satisfaction of further conditions.

Exercise 4.1.1
The K-T conditions obtained for the point x(4) = (3.396, 0)T are:
−3.21u1 + 4u2 − u3 = 0,
1 + u2 − u4 = 0,
g1 = 23.427 > 0; g2 = 6.416 > 0; g3 = 3.396 > 0; g4 = 0 = 0,
(23.427)u1 = 0, (6.416)u2 = 0, (3.396)u3 = 0, (0)u4 = 0.
The solution to the above conditions is the vector u* = (0, 0, 0, 1)T.

Exercise 4.1.1
The solution to the above conditions is the vector u* = (0, 0, 0, 1)T.
Thus, the point x(4) is also a K-T point.
It is clear from the figure that the point x(3) is the minimum point, but the point x(4) is not a minimum point.
Thus, we may conclude from the above exercise problem that a K-T point may or may not be a minimum point.
But if a point is not a K-T point (like point x(1) or x(2)), then it cannot be an optimum point.

Restating the Optimization Problem


Kuhn-Tucker Optimization Problem:
Find vectors x (N×1), u (1×M) and v (1×K) that satisfy:

\[
\nabla f(x) + \sum_{i=1}^{M} u_i \nabla g_i(x) + \sum_{k=1}^{K} v_k \nabla h_k(x) = 0 \qquad \text{(optimality)}
\]

gi(x) ≤ 0  for i = 1, 2, ..., M  (feasibility)
hk(x) = 0  for k = 1, 2, ..., K  (feasibility)
ui gi(x) = 0  for i = 1, 2, ..., M  (complementary slackness)
ui ≥ 0  for i = 1, 2, ..., M  (non-negativity)

If x* is an optimal solution to the NLP, then there exists a (u*, v*) such that (x*, u*, v*) solves the Kuhn-Tucker problem.
The above equations not only give the necessary conditions for optimality, but also provide a way of finding the optimal point.
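Since the KKT system also suggests a way of computing the optimum, here is a minimal MATLAB sketch that solves Exercise 4.1.1 with fmincon (an illustration, not part of the original slides; it requires the Optimization Toolbox, and fmincon expects nonlinear inequalities as c(x) <= 0, so the g >= 0 constraints of the exercise are negated):

function kkt_demo
    % Exercise 4.1.1 solved numerically; fmincon enforces the KKT conditions.
    f  = @(z) (z(1)^2 + z(2) - 11)^2 + (z(1) + z(2)^2 - 7)^2;
    lb = [0; 0];                                  % variable bounds x, y >= 0
    [xstar, fstar, ~, ~, mult] = fmincon(f, [1; 1], [], [], [], [], lb, [], @cons);
    disp(xstar)            % expected to be close to (3, 2)
    disp(mult.ineqnonlin)  % Lagrange multipliers u1, u2 (both ~0 here)
end

function [c, ceq] = cons(z)
    % fmincon uses c(z) <= 0, so negate the g(z) >= 0 constraints of the exercise.
    c = [-(26 - (z(1)-5)^2 - z(2)^2);
         -(20 - 4*z(1) - z(2))];
    ceq = [];
end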

Limitations
The necessity theorem helps identify points that are not optimal: a point is not optimal if it does not satisfy the Kuhn-Tucker conditions.
On the other hand, not all points that satisfy the Kuhn-Tucker conditions are optimal points.
The Kuhn-Tucker sufficiency theorem gives conditions under which a point becomes an optimal solution to a single-objective NLP.

Sufficiency Condition

Sufficient conditions for a point x* to be a strict local minimum of the classical single-objective NLP problem, where f, gj and hk are twice differentiable functions, are that:
1) The necessary KKT conditions are met.
2) The Hessian matrix of the Lagrangian,

\[
\nabla^{2} L(x^*) = \nabla^{2} f(x^*) + \sum_{i} u_i \nabla^{2} g_i(x^*) + \sum_{j} v_j \nabla^{2} h_j(x^*) ,
\]

is positive definite on a subspace of R^N, as defined by the condition:
yT ∇²L(x*) y > 0 for every vector y (1×N) satisfying
∇gj(x*) y = 0 for j belonging to I1 = { j | gj(x*) = 0, uj* > 0 } (active constraints), and
∇hk(x*) y = 0 for k = 1, ..., K, with y ≠ 0.
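Tying this back to Exercise 4.1.1: at x(3) = (3, 2) no constraint is active with uj* > 0, so ∇²L reduces to ∇²f and the subspace is all of R². A minimal MATLAB check (the second-derivative expressions are worked out by hand from f, so treat them as an assumption of this sketch):

% Hessian of f(x,y) = (x^2 + y - 11)^2 + (x + y^2 - 7)^2 at (3, 2)
x = 3; y = 2;
H = [12*x^2 + 4*y - 42,  4*x + 4*y;
     4*x + 4*y,          12*y^2 + 4*x - 26];   % = [74 20; 20 34]
eig(H)    % both eigenvalues positive => x(3) is a strict local minimum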

KKT Sufficiency Theorem (Special Case)


Consider the classical single-objective NLP problem:
minimize: f(x)
Subject to: gj(x) ≤ 0,  for j = 1, 2, ..., J
hk(x) = 0,  for k = 1, 2, ..., K
x = (x1, x2, ..., xN)

Let the objective function f(x) be convex, the inequality constraint functions gj(x) all be convex for j = 1, ..., J, and the equality constraints hk(x) for k = 1, ..., K be linear.
If this is true, then the necessary KKT conditions are also sufficient.
Therefore, in this case, if there exists a solution x* that satisfies the KKT necessary conditions, then x* is an optimal solution to the NLP problem.
In fact, it is a global optimum.

Some Remarks

Kuhn-Tucker conditions are an extension of the Lagrangian function and method.

They provide a powerful means to verify solutions.

But there are limitations:
Sufficiency conditions are difficult to verify.
Practical problems do not have the required nice properties. For example, you will have problems if you do not know the explicit constraint equations (e.g., in FEM).

If you have a multi-objective formulation, then we suggest testing each priority level separately.
