
Convex Optimization Boyd & Vandenberghe

11. Equality constrained minimization

• equality constrained minimization
• eliminating equality constraints
• Newton's method with equality constraints
• infeasible start Newton method
• implementation


Equality constrained minimization


  minimize    f(x)
  subject to  Ax = b

• f convex, twice continuously differentiable
• A ∈ R^{p×n} with rank A = p
• we assume p* is finite and attained

optimality conditions: x* is optimal iff there exists a ν* such that

  ∇f(x*) + A^T ν* = 0,    Ax* = b

Equality constrained minimization    11-2

equality constrained quadratic minimization (with P ∈ S^n_+)

  minimize    (1/2) x^T P x + q^T x + r
  subject to  Ax = b

optimality condition:

  [ P   A^T ] [ x* ]   [ -q ]
  [ A    0  ] [ ν* ] = [  b ]

• coefficient matrix is called KKT matrix
• KKT matrix is nonsingular if and only if

  Ax = 0, x ≠ 0  ⟹  x^T P x > 0

• equivalent condition for nonsingularity: P + A^T A ≻ 0


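The optimality condition above is a single linear system in (x*, ν*). A minimal numerical sketch (NumPy; the random instance and all variable names are mine, not from the slides):

```python
import numpy as np

# Hypothetical small instance: minimize (1/2) x'Px + q'x  s.t. Ax = b.
rng = np.random.default_rng(0)
n, p = 4, 2
M = rng.standard_normal((n, n))
P = M @ M.T + np.eye(n)          # P positive definite
q = rng.standard_normal(n)
A = rng.standard_normal((p, n))  # generic random matrix has rank p
b = rng.standard_normal(p)

# assemble and solve the KKT system [P A'; A 0][x; nu] = [-q; b]
KKT = np.block([[P, A.T], [A, np.zeros((p, p))]])
sol = np.linalg.solve(KKT, np.concatenate([-q, b]))
x_star, nu_star = sol[:n], sol[n:]

# optimality conditions hold: P x* + q + A' nu* = 0 and A x* = b
assert np.allclose(P @ x_star + q + A.T @ nu_star, 0)
assert np.allclose(A @ x_star, b)
```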

Eliminating equality constraints


represent solution set {x | Ax = b} as

  {x | Ax = b} = {F z + x̂ | z ∈ R^{n-p}}

• x̂ is (any) particular solution
• range of F ∈ R^{n×(n-p)} is nullspace of A (rank F = n - p and AF = 0)

reduced or eliminated problem

  minimize    f(F z + x̂)

an unconstrained problem with variable z ∈ R^{n-p}

from solution z*, obtain x* and ν* as

  x* = F z* + x̂,    ν* = -(A A^T)^{-1} A ∇f(x*)

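A sketch of the elimination recipe, assuming SciPy is available; the quadratic objective and the instance are hypothetical, chosen so the gradient is easy to write down:

```python
import numpy as np
from scipy.linalg import null_space
from scipy.optimize import minimize

# Elimination sketch for the (hypothetical) problem
# minimize (1/2)||x||^2 + c'x  subject to  Ax = b.
rng = np.random.default_rng(1)
n, p = 5, 2
A = rng.standard_normal((p, n))
b = rng.standard_normal(p)
c = rng.standard_normal(n)

x_hat = np.linalg.lstsq(A, b, rcond=None)[0]   # a particular solution
F = null_space(A)                              # columns span null(A), shape (n, n - p)

f = lambda x: 0.5 * x @ x + c @ x
res = minimize(lambda z: f(F @ z + x_hat), np.zeros(n - p))  # reduced problem
x_star = F @ res.x + x_hat                     # feasible by construction

grad = x_star + c                              # gradient of f at x_star
nu_star = -np.linalg.solve(A @ A.T, A @ grad)  # nu* = -(A A')^{-1} A grad

assert np.allclose(A @ x_star, b, atol=1e-6)
assert np.allclose(grad + A.T @ nu_star, 0, atol=1e-4)  # stationarity
```

Note that feasibility holds by construction for every z, which is the point of elimination; only stationarity depends on the optimizer's tolerance.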

example: optimal allocation with resource constraint

  minimize    f_1(x_1) + f_2(x_2) + ··· + f_n(x_n)
  subject to  x_1 + x_2 + ··· + x_n = b

eliminate x_n = b - x_1 - ··· - x_{n-1}, i.e., choose

  x̂ = b e_n,    F = [ I ; -1^T ] ∈ R^{n×(n-1)}

reduced problem:

  minimize    f_1(x_1) + ··· + f_{n-1}(x_{n-1}) + f_n(b - x_1 - ··· - x_{n-1})

(variables x_1, ..., x_{n-1})


Newton step

Newton step Δx_nt of f at feasible x is given by solution v of

  [ ∇²f(x)  A^T ] [ v ]     [ ∇f(x) ]
  [ A        0  ] [ w ] = - [   0   ]

interpretations

• Δx_nt solves second order approximation (with variable v)

  minimize    f̂(x + v) = f(x) + ∇f(x)^T v + (1/2) v^T ∇²f(x) v
  subject to  A(x + v) = b

• Δx_nt equations follow from linearizing optimality conditions

  ∇f(x + v) + A^T w ≈ ∇f(x) + ∇²f(x) v + A^T w = 0,    A(x + v) = b

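The Newton step can be computed directly from the KKT system; a sketch for f(x) = -∑ log x_i (hypothetical instance), which also checks that A Δx_nt = 0 and the decrement identity λ(x)^2 = -∇f(x)^T Δx_nt:

```python
import numpy as np

# Newton step at a feasible point, for f(x) = -sum(log x_i) with Ax = b.
rng = np.random.default_rng(2)
n, p = 6, 2
A = rng.standard_normal((p, n))
x = rng.uniform(1.0, 2.0, n)     # x > 0; feasible by choosing b = Ax
b = A @ x

grad = -1.0 / x                  # gradient of f
H = np.diag(1.0 / x**2)          # Hessian of f

KKT = np.block([[H, A.T], [A, np.zeros((p, p))]])
sol = np.linalg.solve(KKT, np.concatenate([-grad, np.zeros(p)]))
dx_nt, w = sol[:n], sol[n:]

assert np.allclose(A @ dx_nt, 0)           # step stays in null(A)
lam2 = dx_nt @ H @ dx_nt                   # Newton decrement squared
assert np.isclose(lam2, -grad @ dx_nt)     # lambda(x)^2 = -grad' dx_nt
```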

Newton decrement

  λ(x) = (Δx_nt^T ∇²f(x) Δx_nt)^{1/2} = (-∇f(x)^T Δx_nt)^{1/2}

properties

• gives an estimate of f(x) - p* using quadratic approximation f̂:

  f(x) - inf_{Ay=b} f̂(y) = λ(x)^2 / 2

• directional derivative in Newton direction:

  (d/dt) f(x + t Δx_nt) |_{t=0} = -λ(x)^2

• in general, λ(x) ≠ (∇f(x)^T ∇²f(x)^{-1} ∇f(x))^{1/2}


Newton's method with equality constraints

given starting point x ∈ dom f with Ax = b, tolerance ε > 0.
repeat
1. Compute the Newton step and decrement Δx_nt, λ(x).
2. Stopping criterion. quit if λ^2/2 ≤ ε.
3. Line search. Choose step size t by backtracking line search.
4. Update. x := x + t Δx_nt.

• a feasible descent method: x^(k) feasible and f(x^(k+1)) < f(x^(k))
• affine invariant

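The algorithm above can be sketched for analytic centering, minimize -∑ log x_i subject to Ax = b; the instance and the backtracking parameters alpha, beta below are my own hypothetical choices:

```python
import numpy as np

# Feasible-start Newton method sketch for minimize -sum(log x_i) s.t. Ax = b.
rng = np.random.default_rng(3)
n, p = 20, 5
A = np.vstack([np.ones(n), rng.standard_normal((p - 1, n))])  # ones row bounds the feasible set
x = rng.uniform(0.5, 1.5, n)
b = A @ x                                  # start is feasible by construction

f = lambda v: -np.sum(np.log(v))
alpha, beta, eps = 0.1, 0.5, 1e-8

for _ in range(100):
    grad = -1.0 / x
    H = np.diag(1.0 / x**2)
    KKT = np.block([[H, A.T], [A, np.zeros((p, p))]])
    sol = np.linalg.solve(KKT, np.concatenate([-grad, np.zeros(p)]))
    dx = sol[:n]
    lam2 = -grad @ dx                      # Newton decrement squared
    if lam2 / 2 <= eps:                    # stopping criterion
        break
    t = 1.0                                # backtracking (also keeps x + t*dx in dom f)
    while np.min(x + t * dx) <= 0 or f(x + t * dx) > f(x) + alpha * t * (grad @ dx):
        t *= beta
    x = x + t * dx

assert lam2 / 2 <= eps                     # converged
assert np.allclose(A @ x, b)               # every iterate stayed feasible
```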

Newton's method and elimination


Newton's method for reduced problem

  minimize    f̃(z) = f(F z + x̂)

• variables z ∈ R^{n-p}
• x̂ satisfies A x̂ = b; rank F = n - p and AF = 0
• Newton's method for f̃, started at z^(0), generates iterates z^(k)

Newton's method with equality constraints: when started at x^(0) = F z^(0) + x̂, iterates are

  x^(k+1) = F z^(k+1) + x̂

hence, don't need separate convergence analysis

Newton step at infeasible points


2nd interpretation of the Newton step (linearized optimality conditions) extends to infeasible x (i.e., Ax ≠ b)

linearizing optimality conditions at infeasible x (with x ∈ dom f) gives

  [ ∇²f(x)  A^T ] [ Δx_nt ]     [ ∇f(x) ]
  [ A        0  ] [   w   ] = - [ Ax - b ]        (1)

primal-dual interpretation

• write optimality condition as r(y) = 0, where

  y = (x, ν),    r(y) = (∇f(x) + A^T ν, Ax - b)

• linearizing r(y) = 0 gives r(y + Δy) ≈ r(y) + Dr(y)Δy = 0:

  [ ∇²f(x)  A^T ] [ Δx_nt ]     [ ∇f(x) + A^T ν ]
  [ A        0  ] [ Δν_nt ] = - [     Ax - b    ]

• same as (1) with w = ν + Δν_nt



Infeasible start Newton method

given starting point x ∈ dom f, ν, tolerance ε > 0, α ∈ (0, 1/2), β ∈ (0, 1).
repeat
1. Compute primal and dual Newton steps Δx_nt, Δν_nt.
2. Backtracking line search on ‖r‖_2.
   t := 1.
   while ‖r(x + tΔx_nt, ν + tΔν_nt)‖_2 > (1 - αt)‖r(x, ν)‖_2,  t := βt.
3. Update. x := x + tΔx_nt,  ν := ν + tΔν_nt.
until Ax = b and ‖r(x, ν)‖_2 ≤ ε.

• not a descent method: f(x^(k+1)) > f(x^(k)) is possible
• directional derivative of ‖r(y)‖_2 in direction Δy = (Δx_nt, Δν_nt) is

  (d/dt) ‖r(y + tΔy)‖_2 |_{t=0} = -‖r(y)‖_2

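A sketch of the method for the same analytic centering problem, now started at an infeasible point; the instance and parameters are hypothetical:

```python
import numpy as np

# Infeasible start Newton sketch for minimize -sum(log x_i) s.t. Ax = b.
rng = np.random.default_rng(4)
n, p = 20, 5
A = np.vstack([np.ones(n), rng.standard_normal((p - 1, n))])
b = A @ rng.uniform(0.5, 1.5, n)       # a strictly feasible point exists

x, nu = np.ones(n), np.zeros(p)        # x > 0 but Ax != b in general

def r(x, nu):                          # residual r(y) = (grad f + A' nu, Ax - b)
    return np.concatenate([-1.0 / x + A.T @ nu, A @ x - b])

alpha, beta, eps = 0.1, 0.5, 1e-8
for _ in range(100):
    res = r(x, nu)
    if np.linalg.norm(res) <= eps:
        break
    H = np.diag(1.0 / x**2)
    KKT = np.block([[H, A.T], [A, np.zeros((p, p))]])
    step = np.linalg.solve(KKT, -res)
    dx, dnu = step[:n], step[n:]
    t = 1.0                            # backtracking on the residual norm
    while (np.min(x + t * dx) <= 0 or
           np.linalg.norm(r(x + t * dx, nu + t * dnu))
           > (1 - alpha * t) * np.linalg.norm(res)):
        t *= beta
    x, nu = x + t * dx, nu + t * dnu

assert np.linalg.norm(r(x, nu)) <= eps
assert np.allclose(A @ x, b, atol=1e-6)   # feasible at termination
```

Once a full step t = 1 is taken, Ax = b holds and all later iterates stay feasible, which matches the stopping test in the slide.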

Solving KKT systems


  [ H   A^T ] [ v ]     [ g ]
  [ A    0  ] [ w ] = - [ h ]

solution methods

• LDL^T factorization
• elimination (if H nonsingular):

  A H^{-1} A^T w = h - A H^{-1} g,    H v = -(g + A^T w)

• elimination with singular H: write as

  [ H + A^T Q A   A^T ] [ v ]     [ g + A^T Q h ]
  [ A              0  ] [ w ] = - [      h      ]

  with Q ⪰ 0 for which H + A^T Q A ≻ 0, and apply elimination


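The elimination formulas can be checked against a direct solve of the full system; a sketch on a hypothetical random instance with nonsingular (here diagonal) H:

```python
import numpy as np

# Block elimination for [H A'; A 0][v; w] = -[g; h], vs. a direct solve.
rng = np.random.default_rng(5)
n, p = 8, 3
H = np.diag(rng.uniform(1.0, 2.0, n))   # nonsingular H
A = rng.standard_normal((p, n))
g, h = rng.standard_normal(n), rng.standard_normal(p)

# elimination: solve the smaller p x p system first
Hinv_AT = np.linalg.solve(H, A.T)
Hinv_g = np.linalg.solve(H, g)
w = np.linalg.solve(A @ Hinv_AT, h - A @ Hinv_g)
v = -np.linalg.solve(H, g + A.T @ w)

# direct solve of the full (n+p) x (n+p) system
KKT = np.block([[H, A.T], [A, np.zeros((p, p))]])
sol = np.linalg.solve(KKT, -np.concatenate([g, h]))

assert np.allclose(sol[:n], v)
assert np.allclose(sol[n:], w)
```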

Equality constrained analytic centering


primal problem:

  minimize    -∑_{i=1}^n log x_i
  subject to  Ax = b

dual problem:

  maximize    -b^T ν + ∑_{i=1}^n log(A^T ν)_i + n

three methods for an example with A ∈ R^{100×500}, different starting points

1. Newton method with equality constraints (requires x^(0) ≻ 0, Ax^(0) = b)
  [figure: f(x^(k)) - p* versus iteration k]


2. Newton method applied to dual problem (requires A^T ν^(0) ≻ 0)

  [figure: p* - g(ν^(k)) versus iteration k]

3. infeasible start Newton method (requires x^(0) ≻ 0)

  [figure: ‖r(x^(k), ν^(k))‖_2 versus iteration k]



complexity per iteration of three methods is identical

1. use block elimination to solve KKT system

   [ diag(x)^{-2}  A^T ] [ Δx ]   [ diag(x)^{-1} 1 ]
   [ A              0  ] [ w  ] = [        0       ]

   reduces to solving A diag(x)^2 A^T w = b

2. solve Newton system

   A diag(A^T ν)^{-2} A^T Δν = -b + A diag(A^T ν)^{-1} 1

3. use block elimination to solve KKT system

   [ diag(x)^{-2}  A^T ] [ Δx ]   [ diag(x)^{-1} 1 ]
   [ A              0  ] [ w  ] = [     b - Ax     ]

   reduces to solving A diag(x)^2 A^T w = 2Ax - b

conclusion: in each case, solve A D A^T w = h with D positive diagonal
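The method-1 reduction can be verified numerically: for feasible x, the w from the small system A diag(x)^2 A^T w = b matches the dual block of the full KKT solve (hypothetical instance):

```python
import numpy as np

# Check: for H = diag(x)^{-2} and feasible x, block elimination of the
# centering KKT system reduces to A diag(x)^2 A^T w = b.
rng = np.random.default_rng(6)
n, p = 10, 3
A = rng.standard_normal((p, n))
x = rng.uniform(0.5, 2.0, n)
b = A @ x                                   # x feasible by construction

D = np.diag(x**2)                           # H^{-1} = diag(x)^2
w_small = np.linalg.solve(A @ D @ A.T, b)   # reduced p x p system

KKT = np.block([[np.diag(1 / x**2), A.T], [A, np.zeros((p, p))]])
sol = np.linalg.solve(KKT, np.concatenate([1 / x, np.zeros(p)]))

assert np.allclose(sol[n:], w_small)        # same dual variable w
```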

Network flow optimization

  minimize    ∑_{i=1}^n φ_i(x_i)
  subject to  Ax = b

• directed graph with n arcs, p + 1 nodes
• x_i: flow through arc i; φ_i: flow cost function for arc i (with φ_i''(x) > 0)
• node-incidence matrix Ã ∈ R^{(p+1)×n} defined as

  Ã_ij = 1 if arc j leaves node i;  -1 if arc j enters node i;  0 otherwise

• reduced node-incidence matrix A ∈ R^{p×n} is Ã with last row removed
• b ∈ R^p is (reduced) source vector
• rank A = p if graph is connected

KKT system

  [ H   A^T ] [ v ]     [ g ]
  [ A    0  ] [ w ] = - [ h ]

• H = diag(φ_1''(x_1), ..., φ_n''(x_n)), positive diagonal
• solve via elimination:

  A H^{-1} A^T w = h - A H^{-1} g,    H v = -(g + A^T w)

• sparsity pattern of coefficient matrix is given by graph connectivity:

  (A H^{-1} A^T)_ij ≠ 0  ⟺  (A A^T)_ij ≠ 0  ⟺  nodes i and j are connected by an arc

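The sparsity claim can be illustrated on a tiny hypothetical graph: for a path 0-1-2-3 with arcs i → i+1, A H^{-1} A^T is tridiagonal, since only adjacent nodes share an arc:

```python
import numpy as np

# Reduced node-incidence matrix of a path graph with 4 nodes, 3 arcs.
n_nodes, n_arcs = 4, 3
A_full = np.zeros((n_nodes, n_arcs))
for j in range(n_arcs):
    A_full[j, j] = 1.0       # arc j leaves node j
    A_full[j + 1, j] = -1.0  # arc j enters node j+1
A = A_full[:-1, :]           # drop last row: reduced incidence matrix, p = 3

H = np.diag([2.0, 3.0, 4.0])         # positive diagonal (phi_i'' values)
S = A @ np.linalg.inv(H) @ A.T

# S_ij != 0 exactly when nodes i, j are equal or joined by an arc
assert S[0, 2] == 0 and S[2, 0] == 0     # nodes 0 and 2: no common arc
assert S[0, 1] != 0 and S[1, 2] != 0     # adjacent nodes
```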

Analytic center of linear matrix inequality


  minimize    -log det X
  subject to  tr(A_i X) = b_i,    i = 1, ..., p

variable X ∈ S^n

optimality conditions:

  X* ≻ 0,    -(X*)^{-1} + ∑_{j=1}^p ν*_j A_j = 0,    tr(A_i X*) = b_i,    i = 1, ..., p

Newton equation at feasible X:

  X^{-1} ΔX X^{-1} + ∑_{j=1}^p w_j A_j = X^{-1},    tr(A_i ΔX) = 0,    i = 1, ..., p

• follows from linear approximation (X + ΔX)^{-1} ≈ X^{-1} - X^{-1} ΔX X^{-1}
• n(n+1)/2 + p variables ΔX, w



solution by block elimination

• eliminate ΔX from first equation: ΔX = X - ∑_{j=1}^p w_j X A_j X
• substitute ΔX in second equation:

  ∑_{j=1}^p tr(A_i X A_j X) w_j = b_i,    i = 1, ..., p        (2)

a dense positive definite set of linear equations with variable w ∈ R^p

flop count (dominant terms) using Cholesky factorization X = LL^T:

• form p products L^T A_j L: (3/2) p n^3
• form p(p+1)/2 inner products tr((L^T A_i L)(L^T A_j L)): (1/2) p^2 n^2
• solve (2) via Cholesky factorization: (1/3) p^3

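The block elimination above can be sketched numerically: form the p × p system (2), recover ΔX, and check both Newton equations (hypothetical small instance with symmetric A_i and a feasible X ≻ 0):

```python
import numpy as np

# Block-elimination Newton step for the matrix centering problem.
rng = np.random.default_rng(7)
n, p = 5, 2
M = rng.standard_normal((n, n))
X = M @ M.T + np.eye(n)                          # X positive definite
As = [(B + B.T) / 2 for B in rng.standard_normal((p, n, n))]  # symmetric A_i
bs = np.array([np.trace(Ai @ X) for Ai in As])   # b_i = tr(A_i X): X feasible

# form and solve the dense p x p system (2): sum_j tr(A_i X A_j X) w_j = b_i
G = np.array([[np.trace(Ai @ X @ Aj @ X) for Aj in As] for Ai in As])
w = np.linalg.solve(G, bs)

# recover Delta X = X - sum_j w_j X A_j X
dX = X - sum(wj * X @ Aj @ X for wj, Aj in zip(w, As))

Xinv = np.linalg.inv(X)
lhs = Xinv @ dX @ Xinv + sum(wj * Aj for wj, Aj in zip(w, As))
assert np.allclose(lhs, Xinv)                               # first Newton equation
assert np.allclose([np.trace(Ai @ dX) for Ai in As], 0)     # tr(A_i dX) = 0
```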
