In mathematical optimization, the Karush–Kuhn–Tucker (KKT) conditions, also known as the Kuhn–Tucker conditions, are first-order necessary conditions for a solution in nonlinear programming to be optimal, provided that some regularity conditions are satisfied. Allowing inequality constraints, the KKT approach to nonlinear programming generalizes the method of Lagrange multipliers, which allows only equality constraints. The system of equations and inequalities corresponding to the KKT conditions is usually not solved directly, except in the few special cases where a closed-form solution can be derived analytically. In general, many optimization algorithms can be interpreted as methods for numerically solving the KKT system of equations and inequalities.[1]

The KKT conditions were originally named after Harold W. Kuhn and Albert W. Tucker, who first published the conditions in 1951.[2] Later scholars discovered that the necessary conditions for this problem had been stated by William Karush in his master's thesis in 1939.[3][4]

Consider the following nonlinear optimization problem:

minimize f(x)

subject to

g_i(x) \le 0,
h_j(x) = 0,

where x is the optimization variable, f is the objective or utility function, g_i (i = 1, \ldots, m) are the inequality constraint functions, and h_j (j = 1, \ldots, \ell) are the equality constraint functions. The numbers of inequality and equality constraints are denoted m and \ell, respectively.

2 Necessary conditions

Suppose that the objective function f : \mathbb{R}^n \to \mathbb{R} and the constraint functions g_i : \mathbb{R}^n \to \mathbb{R} and h_j : \mathbb{R}^n \to \mathbb{R} are continuously differentiable at a point x^*. If x^* is a local optimum and the optimization problem satisfies some regularity conditions (see below), then there exist constants \mu_i (i = 1, \ldots, m) and \lambda_j (j = 1, \ldots, \ell), called KKT multipliers, such that

Stationarity
For minimizing f(x): -\nabla f(x^*) = \sum_{i=1}^{m} \mu_i \nabla g_i(x^*) + \sum_{j=1}^{\ell} \lambda_j \nabla h_j(x^*),

Primal feasibility
g_i(x^*) \le 0, for i = 1, \ldots, m
h_j(x^*) = 0, for j = 1, \ldots, \ell

Dual feasibility
\mu_i \ge 0, for i = 1, \ldots, m

Complementary slackness
\mu_i g_i(x^*) = 0, for i = 1, \ldots, m.

In the particular case m = 0, i.e., when there are no inequality constraints, the KKT conditions turn into the Lagrange conditions, and the KKT multipliers are called Lagrange multipliers. If some of the functions are non-differentiable, subdifferential versions of the Karush–Kuhn–Tucker (KKT) conditions are available.[5]
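The four conditions can be checked by hand on a small example. The following sketch (the problem instance is our own illustration, not one taken from the article) verifies stationarity, primal and dual feasibility, and complementary slackness for a toy quadratic program in pure Python:

```python
# Toy problem (illustrative, not from the article):
#   minimize   f(x) = x1^2 + x2^2
#   subject to g(x) = 1 - x1 - x2 <= 0   (m = 1 inequality, no equalities)
# The minimizer is x* = (1/2, 1/2) with KKT multiplier mu = 1.

def grad_f(x):
    return [2 * x[0], 2 * x[1]]

def g(x):
    return 1 - x[0] - x[1]

def grad_g(x):
    return [-1.0, -1.0]

x_star = [0.5, 0.5]
mu = 1.0
tol = 1e-12

# Stationarity (minimization form): -grad f(x*) = mu * grad g(x*)
stationarity = all(abs(-df - mu * dg) < tol
                   for df, dg in zip(grad_f(x_star), grad_g(x_star)))

primal_feasibility = g(x_star) <= tol                 # g(x*) <= 0
dual_feasibility = mu >= 0                            # mu >= 0
complementary_slackness = abs(mu * g(x_star)) < tol   # mu * g(x*) = 0

print(stationarity, primal_feasibility, dual_feasibility, complementary_slackness)
# prints: True True True True
```

Here the constraint is active at the optimum (g(x*) = 0), so complementary slackness holds with a strictly positive multiplier; for an inactive constraint it would instead force mu = 0.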
For the problem

Maximize f(x)
subject to g_i(x) \le 0, h_j(x) = 0,

allowing an extra multiplier \lambda_0 \ge 0 in front of \nabla f(x^*), with the multipliers (\lambda_0, \mu_1, \ldots, \mu_m, \lambda_1, \ldots, \lambda_\ell) not all zero, yields

\lambda_0 \nabla f(x^*) + \sum_{i=1}^{m} \mu_i \nabla g_i(x^*) + \sum_{j=1}^{\ell} \lambda_j \nabla h_j(x^*) = 0,

which are called the Fritz John conditions.

The KKT conditions belong to a wider class of the first-order necessary conditions (FONC), which allow for non-smooth functions using subderivatives.

9 References

[1] Boyd, Stephen; Vandenberghe, Lieven (2004). Convex Optimization. Cambridge: Cambridge University Press. p. 244. ISBN 0-521-83378-7. MR 2061575.

10 Further reading

Andreani, R.; Martínez, J. M.; Schuverdt, M. L. (2005). "On the relation between constant positive linear dependence condition and quasinormality constraint qualification". Journal of Optimization Theory and Applications. 125 (2): 473-485. doi:10.1007/s10957-004-1861-9.
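The role of the extra multiplier \lambda_0 in the Fritz John conditions above can be seen in a classic degenerate example (our illustration, not taken from the article): minimize f(x) = x subject to g(x) = x^2 <= 0. The only feasible point is x* = 0, so it is the minimizer, yet no KKT multiplier exists because grad g(0) = 0:

```python
# Classic degenerate example (illustrative, not from the article):
#   minimize f(x) = x   subject to   g(x) = x^2 <= 0.
# The only feasible point is x* = 0, so it is the minimizer.

def grad_f(x):   # f(x) = x
    return 1.0

def grad_g(x):   # g(x) = x^2
    return 2.0 * x

x_star = 0.0

# KKT stationarity would need  f'(x*) + mu * g'(x*) = 0  for some mu >= 0,
# but f'(0) + mu * g'(0) = 1 for every mu, so no multiplier works.
kkt_residual = grad_f(x_star) + 5.0 * grad_g(x_star)  # same value for any mu
print(kkt_residual)  # 1.0

# Fritz John adds a multiplier lambda0 >= 0 on the objective gradient;
# taking lambda0 = 0, mu = 1 (not both zero) satisfies the condition.
lam0, mu = 0.0, 1.0
fj_residual = lam0 * grad_f(x_star) + mu * grad_g(x_star)
print(fj_residual)  # 0.0
```

The point x* = 0 thus satisfies the Fritz John conditions but not the KKT conditions, which is why regularity conditions (constraint qualifications) are needed to guarantee \lambda_0 > 0.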
11 External links
Karush–Kuhn–Tucker conditions with derivation and examples