
Institute Automation & Industrial IT

Master AUT & IT


Fachhochschule Köln
Cologne University of Applied Sciences

Advanced Process Control and Optimization

Prof. Jürgen Boehm Rietig

Assignment 2
Non-Linear Optimization Using KKT

By: Ashwin Kumar Venkatesan

Problem Statement:

f(x) = -x1 → minimum
g1(x) = x1^2 + x2 - 1 ≤ 0
g2(x) = x1^2 - x2 - 1 ≤ 0
a) The constraint boundaries are obtained by setting g1 = 0 and g2 = 0:

x2 = -x1^2 + 1   (boundary of g1)
x2 = x1^2 - 1    (boundary of g2)

The feasible region is the set enclosed between these two parabolas.
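As an illustration (not part of the original sketch), the feasible region between these two parabolas can be plotted, for example with Python and matplotlib:

# Illustrative sketch of the feasible region between the two parabolas
# (not part of the original solution).
import numpy as np
import matplotlib.pyplot as plt

x1 = np.linspace(-1.5, 1.5, 400)
upper = -x1**2 + 1   # boundary of g1: x2 = -x1^2 + 1
lower = x1**2 - 1    # boundary of g2: x2 =  x1^2 - 1

plt.plot(x1, upper, label="g1 = 0")
plt.plot(x1, lower, label="g2 = 0")
plt.fill_between(x1, lower, upper, where=(upper >= lower), alpha=0.3,
                 label="feasible region (g1 <= 0, g2 <= 0)")
plt.xlabel("x1")
plt.ylabel("x2")
plt.legend()
plt.show()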


b) To prove convexity we compute the Hessian matrices of f, g1 and g2.


Since

∇²f = [0 0; 0 0]  (f is linear),   ∇²g1 = ∇²g2 = [2 0; 0 0]

all three Hessians are positive semidefinite, i.e. f, g1 and g2 are convex functions.

Therefore this is a convex optimization problem.
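As a quick numerical side check (a sketch, not part of the original solution), the eigenvalues of the Hessians written above can be inspected with numpy:

# Eigenvalue check of the Hessians above (illustrative only).
import numpy as np

H_f  = np.array([[0.0, 0.0], [0.0, 0.0]])   # Hessian of f (f is linear)
H_g1 = np.array([[2.0, 0.0], [0.0, 0.0]])   # Hessian of g1
H_g2 = np.array([[2.0, 0.0], [0.0, 0.0]])   # Hessian of g2

for name, H in [("f", H_f), ("g1", H_g1), ("g2", H_g2)]:
    eig = np.linalg.eigvalsh(H)
    print(name, eig, "convex" if np.all(eig >= 0) else "not convex")
# All eigenvalues are >= 0 (positive semidefinite), so f, g1 and g2 are convex.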

c)

∇f + β1 ∇g1 + β2 ∇g2 = 0

With respect to x1:
-1 + β1·2x1 + β2·2x1 = 0
β1 x1 + β2 x1 = 1/2                      ... (1)
With respect to x2:
β1 - β2 = 0  =>  β1 = β2                 ... (2)

β1 · g1(x1, x2) = 0                      ... (3)

β2 · g2(x1, x2) = 0                      ... (4)
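As a side check (not part of the original hand derivation), the same system of equations (1)-(4) can be written down and solved symbolically, for example with sympy:

# Symbolic form of equations (1)-(4) with sympy (illustrative sketch).
import sympy as sp

x1, x2, b1, b2 = sp.symbols("x1 x2 beta1 beta2", real=True)
f  = -x1
g1 = x1**2 + x2 - 1
g2 = x1**2 - x2 - 1

lag = f + b1*g1 + b2*g2
stationarity = [sp.diff(lag, x1), sp.diff(lag, x2)]   # gives equations (1) and (2)
slackness    = [b1*g1, b2*g2]                         # equations (3) and (4)

sol = sp.solve(stationarity + slackness, [x1, x2, b1, b2], dict=True)
print(sol)  # candidate KKT points; the one with beta1 = beta2 >= 0 is x = (1, 0)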

There are four cases:

1) β1 = β2 = 0 and g1, g2 < 0  (both constraints inactive)

2) β1 = 0, β2 > 0 and g1 < 0, g2 = 0

3) β2 = 0, β1 > 0 and g1 = 0, g2 < 0

4) β1, β2 > 0 and g1 = g2 = 0  (both constraints active)

CASE 1:

β1 = β2 = 0 and g1, g2 < 0

eq(1) => 0 = 1/2, which is a contradiction.
eq(2) => 0 = 0
eq(3), eq(4) => satisfied trivially.

Hence there is no KKT point in this case.


CASE 2:

β1 = 0, β2 > 0 and g1 < 0, g2 = 0

Cannot happen: eq(2) requires β1 = β2, so β2 would also have to be zero.

CASE 3:

β2 = 0, β1 > 0 and g1 = 0, g2 < 0

Cannot happen for the same reason: eq(2) forces β1 = β2.

CASE 4:

β1, β2 > 0 and g1 = g2 = 0

eq(1) => β1 x1 + β2 x1 = 1/2             ... (5)
eq(2) => β1 = β2                         ... (6)
eq(3) => x1^2 + x2 = 1                   ... (7)
eq(4) => x1^2 - x2 = 1                   ... (8)

(7)+(8) => 2x1^2 = 2 => x1^2 = 1 => x1 = ±1

For x1 = ±1 => x2 = 0

(5),(6) => 2β1 x1 = 1/2 => β1 = 1/(4 x1)

For x1 = 1:  β1 = β2 = 0.25 ≥ 0, so the KKT conditions are satisfied.
For x1 = -1: β1 = β2 = -0.25 < 0, which violates β1, β2 ≥ 0, so this point is rejected.

Therefore the minimizer is x* = (1, 0) with β1 = β2 = 0.25 and f(x*) = -1.
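As a numerical cross-check (a sketch, not part of the original hand calculation), the same problem can be solved with scipy.optimize.minimize, which should return approximately the same point:

# Numerical cross-check of the KKT result with scipy (illustrative only).
import numpy as np
from scipy.optimize import minimize

f = lambda x: -x[0]
constraints = [
    {"type": "ineq", "fun": lambda x: -(x[0]**2 + x[1] - 1)},  # g1(x) <= 0
    {"type": "ineq", "fun": lambda x: -(x[0]**2 - x[1] - 1)},  # g2(x) <= 0
]

res = minimize(f, x0=np.array([0.0, 0.0]), constraints=constraints)
print(res.x)    # approximately [1, 0]
print(res.fun)  # approximately -1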


Problem 2:

f(x) = x1^2 + x2^2 → minimum

g1(x) = 1 - x1 ≤ 0
g2(x) = 1 - x1^2 - x2^2 ≤ 0

KKT method:

∇f + β1 ∇g1 + β2 ∇g2 = 0

β1 · g1(x1, x2) = 0

β2 · g2(x1, x2) = 0

The KKT conditions are in general only necessary, not sufficient, for optimality. They become sufficient if the objective function and both constraint functions are convex.

To check convexity we compute the Hessian matrices. In this problem the Hessian of f is positive definite, g1 is linear (its Hessian is the zero matrix), but the Hessian of g2 is negative definite, so g2 is concave.

Since

∇²f = [2 0; 0 2]  (positive definite),   ∇²g1 = [0 0; 0 0],   ∇²g2 = [-2 0; 0 -2]  (negative definite)
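The eigenvalues of these Hessians can again be checked numerically (illustrative only, not part of the original solution):

# Eigenvalue check of the Hessians of the second problem with numpy.
import numpy as np

H_f  = np.array([[2.0, 0.0], [0.0, 2.0]])     # Hessian of f
H_g1 = np.array([[0.0, 0.0], [0.0, 0.0]])     # Hessian of g1 (g1 is linear)
H_g2 = np.array([[-2.0, 0.0], [0.0, -2.0]])   # Hessian of g2

for name, H in [("f", H_f), ("g1", H_g1), ("g2", H_g2)]:
    print(name, np.linalg.eigvalsh(H))
# f: eigenvalues 2, 2 (convex); g1: 0, 0 (linear); g2: -2, -2 (concave),
# so this is not a convex optimization problem.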

Since g2 is not convex, this is not a convex optimization problem, and the KKT conditions alone are not sufficient; additional conditions (e.g. second-order conditions) would be needed to confirm optimality.


Graphically, however, a minimum can still be identified: the constraint g1 forces x1 ≥ 1, so the minimum of f(x) = x1^2 + x2^2 over the feasible region is attained at x* = (1, 0) with f(x*) = 1.
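As a numerical cross-check of this graphical argument (a sketch, not part of the original solution), the second problem can also be solved with scipy.optimize.minimize:

# Numerical cross-check of the second problem with scipy (illustrative only).
import numpy as np
from scipy.optimize import minimize

f = lambda x: x[0]**2 + x[1]**2
constraints = [
    {"type": "ineq", "fun": lambda x: x[0] - 1},               # g1(x) = 1 - x1 <= 0
    {"type": "ineq", "fun": lambda x: x[0]**2 + x[1]**2 - 1},  # g2(x) = 1 - x1^2 - x2^2 <= 0
]

res = minimize(f, x0=np.array([2.0, 1.0]), constraints=constraints)
print(res.x)    # approximately [1, 0]
print(res.fun)  # approximately 1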
