The document discusses the geometries of the Karush-Kuhn-Tucker (KKT) conditions for optimization problems with equality and inequality constraints. It explains that for problems with equality constraints, the gradients of the objective function and constraints must be parallel at optimal points. For problems with inequality constraints, the gradient of the objective function must lie in the cone between the gradients of the active constraints. The KKT conditions provide necessary conditions for optimality of constrained optimization problems.


MS&E 211

The Principles and Geometries of KKT and Optimization

min f(x1, x2, ..., xn)
s.t. (x1, x2, ..., xn) ∈ X

December 1, 2005

Geometries of KKT: Unconstrained

• Problem:
  • Minimize f(x), where x is a vector whose components may take any values, positive or negative
• First Order Necessary Condition (min or max):
  • ∇f(x) = 0 (∂f/∂xi = 0 for all i) is the first order necessary condition for optimality
• Second Order Necessary Condition:
  • ∇²f(x) is positive semidefinite (PSD)
  • [x • ∇²f(x) • x ≥ 0 for all x]
  • (in one dimension: ∂²f/∂xi² ≥ 0)
• Second Order Sufficient Condition (given FONC satisfied):
  • ∇²f(x) is positive definite (PD)
  • [x • ∇²f(x) • x > 0 for all x]
[Figure: f plotted against xi, with ∂f/∂xi = 0 at the minimum]
(A numerical check of these conditions is sketched below.)
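As an illustration (not from the original slides), here is a minimal sketch that checks the first and second order conditions numerically for a simple smooth function, using finite differences for the gradient and Hessian and the Hessian's eigenvalues for the PD test; the function quad_example and the helper names are made up for this sketch.

```python
import numpy as np

def quad_example(x):
    # A simple smooth test function: f(x) = (x1 - 1)^2 + 2*(x2 + 3)^2
    return (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 3.0) ** 2

def numerical_gradient(f, x, h=1e-6):
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)   # central difference
    return g

def numerical_hessian(f, x, h=1e-4):
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        H[:, i] = (numerical_gradient(f, x + e) - numerical_gradient(f, x - e)) / (2 * h)
    return H

x_star = np.array([1.0, -3.0])                 # candidate minimizer
grad = numerical_gradient(quad_example, x_star)
hess = numerical_hessian(quad_example, x_star)
print("FONC, gradient ~ 0:", np.allclose(grad, 0.0, atol=1e-4))
print("SOSC, Hessian PD:", np.all(np.linalg.eigvalsh(hess) > 1e-8))
```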

Geometries of KKT: Equality Constrained (one constraint)

• Problem:
  • Minimize f(x), where x is a vector
  • Subject to: h(x) = b
• First Order Necessary Condition for minimum (or for maximum):
  • ∇f(x) = λ ∇h(x) for some free λ (λ is a scalar)
  • h(x) = b
• The two surfaces must be tangent.
• h(x) = b and −h(x) = −b are the same constraint, so there is no sign restriction on λ.


Geometries of KKT: Equality Constrained (one constraint)

• First Order Necessary Condition:
  • ∇f(x) = λ ∇h(x) for some λ
• Lagrangian:
  • L(x, λ) = f(x) − λ [h(x) − b]
  • Minimize L(x, λ) over x and maximize L(x, λ) over λ. Use the principles of unconstrained optimization.
• ∇L(x, λ) = 0:
  • ∇x L(x, λ) = ∇f(x) − λ ∇h(x) = 0
  • ∇λ L(x, λ) = h(x) − b = 0
(A symbolic sketch of these stationarity conditions follows below.)
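As an illustration (not part of the original slides), the sketch below uses sympy to set up the Lagrangian for a made-up one-constraint example, minimize x1² + x2² subject to x1 + 2x2 = 5, and solve the stationarity system ∇x L = 0, h(x) = b.

```python
import sympy as sp

x1, x2, lam = sp.symbols('x1 x2 lam', real=True)

f = x1**2 + x2**2          # objective (illustrative choice)
h = x1 + 2*x2              # constraint function, h(x) = b
b = 5

# Lagrangian L(x, lam) = f(x) - lam*(h(x) - b)
L = f - lam * (h - b)

# Stationarity in x plus the constraint itself
eqs = [sp.diff(L, x1), sp.diff(L, x2), h - b]
sol = sp.solve(eqs, [x1, x2, lam], dict=True)
print(sol)   # [{x1: 1, x2: 2, lam: 2}]
```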


Geometries of KKT: Equality Constrained (multiple constraints)

• Problem:
  • Minimize f(x), where x is a vector
  • Such that: hi(x) = bi for i = 1, 2, …, m
• KKT Conditions (Necessary Conditions): there exist λi, i = 1, 2, …, m, such that
  • ∇f(x) = Σi=1..m λi ∇hi(x)
  • hi(x) = bi for i = 1, 2, …, m
• Such a point (x, λ) is called a KKT point, and λ is called the Dual Vector or the Lagrange Multipliers. Furthermore, these conditions are sufficient if f(x) is convex and hi(x), i = 1, 2, …, m, are linear. (A sketch of solving such a system for a quadratic objective follows below.)
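As an illustration (not from the original slides), when f is a convex quadratic ½xᵀQx + cᵀx and the constraints are linear, Ax = b, the conditions ∇f(x) = Σ λi ∇hi(x) and hi(x) = bi reduce to a single linear system; the matrices below are made-up example data.

```python
import numpy as np

# Illustrative data: minimize 0.5*x'Qx + c'x  subject to  Ax = b
Q = np.array([[2.0, 0.0], [0.0, 4.0]])
c = np.array([-2.0, -8.0])
A = np.array([[1.0, 1.0]])        # one linear equality constraint
b = np.array([4.0])

n, m = Q.shape[0], A.shape[0]

# Stationarity Qx + c = A' lam  and feasibility Ax = b, written as one KKT system
K = np.block([[Q, -A.T],
              [A, np.zeros((m, m))]])
rhs = np.concatenate([-c, b])

sol = np.linalg.solve(K, rhs)
x_star, lam_star = sol[:n], sol[n:]
print("x* =", x_star, " lambda* =", lam_star)   # x* ~ [1.667, 2.333], lambda* ~ [1.333]
```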

Geometries of KKT: Unconstrained, Except Non-Negativity Condition

• Problem:
  • Minimize f(x), where x is a vector, x ≥ 0
• First Order Necessary Condition:
  • ∂f/∂xi = 0 if xi > 0
  • ∂f/∂xi ≥ 0 if xi = 0
• Thus: [∂f/∂xi] xi = 0 for all i, or
  ∇f(x) • x = 0, ∇f(x) ≥ 0
• If x is an interior point (x > 0), then ∇f(x) = 0
• Nothing changes if the constraint is not binding
[Figure: f plotted against xi, with ∂f/∂xi > 0 at a minimizer on the boundary xi = 0 and ∂f/∂xi = 0 at an interior minimizer]
(A numerical check of these conditions follows below.)
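As a non-authoritative illustration, the sketch below minimizes a made-up function over x ≥ 0 with scipy's bound-constrained solver and checks the conditions above: ∂f/∂xi = 0 where xi > 0 and ∂f/∂xi ≥ 0 where xi = 0.

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Illustrative objective: the first term wants x1 = -1, which is cut off by x >= 0
    return (x[0] + 1.0) ** 2 + (x[1] - 2.0) ** 2

def grad_f(x):
    return np.array([2.0 * (x[0] + 1.0), 2.0 * (x[1] - 2.0)])

res = minimize(f, x0=[1.0, 1.0], jac=grad_f,
               bounds=[(0, None), (0, None)], method='L-BFGS-B')
x, g = res.x, grad_f(res.x)
print("x* =", x)                                # approx [0, 2]
print("complementarity g_i * x_i =", g * x)     # approx 0 componentwise
print("gradient nonnegative where x_i = 0:", np.all(g[x < 1e-8] >= -1e-6))
```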


Geometry of KKT: Inequality Constrained (one constraint)

• Problem:
  • Minimize f(x), where x is a vector
  • Subject to: g(x) ≥ b
• Assume the feasible set and the set of points preferred to any given point are all convex sets (i.e., a convex program).
• First Order Necessary Condition:
  • ∇f(x) = λ ∇g(x) for some λ ≥ 0 (λ is a scalar)
  • If the constraint is binding [g(x) = b], then λ ≥ 0
  • If the constraint is non-binding [g(x) > b], then ∇f(x) = 0, i.e., λ = 0


Geometries of KKT: Inequality Constrained (one constraint)

• For any point x′ on the frontier of the feasible region of g(x) ≥ b, recall that −∇g(x′) is the direction of steepest descent of g(x) at x′.
• It is also perpendicular to the frontier g(x) = b, pointing in the direction of decreasing g(x).
• Thus −∇g(x) is perpendicular to the tangent hyperplane of g(x) = b at x.
[Figure: feasible region g(x) ≥ b in the (x1, x2) plane, with −∇g(x′) drawn perpendicular to the frontier at x′]

Geometries of KKT: Inequality Constrained (one constraint)

• ∇f(x′) is similarly a vector perpendicular to the level set of f through x′:
  • Say f(x′) = c′. Then −∇f(x′) is a vector pointing in the direction of decreasing f(x).
  • Also, −∇f(x′) is perpendicular to the tangent hyperplane of f(x) = c′ at x′.
[Figure: level set f(x) = c′ (constant) in the (x1, x2) plane, with −∇f(x′) drawn perpendicular to it at x′]

Geometries of KKT: Inequality Constrained (one constraint)

• First Order Necessary Condition:
  • ∇f(x) = λ ∇g(x) for some λ ≥ 0 (λ is a scalar)
  • If the constraint is binding [g(x) = b], then λ ≥ 0
• −∇g(x) is perpendicular to the frontier g(x) = b, and −∇f(x) is perpendicular to the level set f(x) = constant. At the optimum, −∇g(x) and −∇f(x) must be parallel: the two surfaces must be tangent.
[Figure: feasible region g(x) ≥ b and a level set f(x) = constant in the (x1, x2) plane; at the optimum the two curves are tangent, with −∇g(x) and −∇f(x) pointing in the same direction]

Geometries of KKT: Inequality Constrained (one constraint)

[Figure: feasible region g(x) ≥ b and a level set f(x) = constant in the (x1, x2) plane, with −∇g(x) and −∇f(x) pointing in different directions]
If −∇g(x) and −∇f(x) are not parallel, there are feasible points with smaller f(x).

Geometries of KKT: Inequality Constrained (one constraint)

[Figure: feasible region g(x) ≥ b and a level set f(x) = constant in the (x1, x2) plane, with −∇g(x) and −∇f(x) parallel but pointing in opposite directions]
If −∇g(x) and −∇f(x) are parallel but point in opposite directions, there are feasible points with smaller f(x).

Geometries of KKT: Inequality Constrained (one constraint)

First Order Necessary Condition:
• ∇f(x) = 0 if the constraint is not binding [g(x) > b]
[Figure: level sets of f(x) decreasing toward a sink in the interior of the feasible region; at the optimal point ∇f(x) = 0]
This can be seen as an unconstrained optimum.

Geometries of KKT: Inequality Constrained (one constraint)

• First Order Necessary Condition:
  • ∇f(x) = λ ∇g(x) for some λ ≥ 0
  • If the constraint is non-binding [g(x) > b], then λ = 0
• Lagrangian:
  • L(x, λ) = f(x) − λ [g(x) − b], s.t. λ ≥ 0
  • Minimize L(x, λ) over x and maximize L(x, λ) over λ ≥ 0. Use the principles of unconstrained optimization.
  • ∇x L(x, λ) = ∇f(x) − λ ∇g(x) = 0
  • If g(x) − b > 0, then λ = 0


Geometries of KKT: Inequality Constrained (one constraint)

• Problem:
  • Minimize f(x), where x is a vector
  • Subject to: g(x) ≥ b
• Equivalently, the KKT conditions are:
  • ∇f(x) = λ ∇g(x)
  • g(x) ≥ b, λ ≥ 0
  • λ [g(x) − b] = 0
(A numerical sketch of the binding and non-binding cases follows below.)
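As a non-authoritative illustration, the sketch below solves minimize (x1−3)² + (x2−2)² subject to x1 + x2 ≥ b with scipy's SLSQP solver for two made-up values of b, one where the constraint ends up binding (λ > 0) and one where it is slack (λ = 0); the multiplier is recovered from the stationarity condition ∇f(x) = λ ∇g(x).

```python
import numpy as np
from scipy.optimize import minimize

def solve(b):
    f = lambda x: (x[0] - 3.0) ** 2 + (x[1] - 2.0) ** 2
    grad_f = lambda x: np.array([2.0 * (x[0] - 3.0), 2.0 * (x[1] - 2.0)])
    con = {'type': 'ineq', 'fun': lambda x: x[0] + x[1] - b}   # g(x) - b >= 0
    res = minimize(f, x0=[4.0, 4.0], jac=grad_f, constraints=[con], method='SLSQP')
    x = res.x
    # Recover lambda from grad_f(x) = lambda * grad_g(x), with grad_g = (1, 1);
    # both components agree when the constraint is binding, and are ~0 when it is slack
    lam = max(grad_f(x)[0], 0.0)
    return x, lam

# b = 6: the unconstrained minimizer (3, 2) violates x1 + x2 >= 6, so the constraint binds
print(solve(6.0))   # x ~ (3.5, 2.5), lambda ~ 1.0
# b = 4: the unconstrained minimizer already satisfies the constraint, so lambda = 0
print(solve(4.0))   # x ~ (3, 2), lambda ~ 0
```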


Geometries of KKT: Inequality Constrained (two constraints)

• Problem:
  • Minimize f(x), where x is a vector
  • Subject to: g1(x) ≥ b1 and g2(x) ≥ b2
• First Order Necessary Conditions:
  • ∇f(x) = λ1 ∇g1(x) + λ2 ∇g2(x), λ1 ≥ 0, λ2 ≥ 0
  • ∇f(x) lies in the cone between ∇g1(x) and ∇g2(x)
  • g1(x) > b1 ⇒ λ1 = 0
  • g2(x) > b2 ⇒ λ2 = 0
  • λ1 [g1(x) − b1] = 0
  • λ2 [g2(x) − b2] = 0
[Figure: shaded feasible set for the two constraints in the (x1, x2) plane; both constraints are binding at the optimum, and −∇f(x) lies between −∇g1(x) and −∇g2(x)]

Geometries of KKT: Inequality Constrained (two constraints)

• Problem:
  • Minimize f(x), where x is a vector
  • Subject to: g1(x) ≥ b1 and g2(x) ≥ b2
• First Order Necessary Conditions:
  • ∇f(x) = λ1 ∇g1(x), λ1 ≥ 0
  • g2(x) > b2 ⇒ λ2 = 0
  • g1(x) − b1 = 0
[Figure: shaded feasible set for the two constraints; only the first constraint is binding at the optimum, and −∇f(x) is parallel to −∇g1(x)]

Geometries of KKT: Inequality Constrained (two constraints)

• Problem:
  • Minimize f(x), where x is a vector
  • Subject to: g1(x) ≥ b1 and g2(x) ≥ b2
• First Order Necessary Conditions:
  • ∇f(x) = 0
  • g1(x) > b1 ⇒ λ1 = 0
  • g2(x) > b2 ⇒ λ2 = 0
[Figure: shaded feasible set for the two constraints; the optimum lies in the interior, where ∇f(x) = 0, and neither constraint is binding]

Geometries of KKT: Inequality Constrained (two constraints)

• Lagrangian:
  • L(x, λ1, λ2) = f(x) − λ1 [g1(x) − b1] − λ2 [g2(x) − b2]
• Minimize L(x, λ1, λ2) over x.
  • Use the principles of unconstrained optimization
  • ∇x L(x, λ1, λ2) = 0 (gradient with respect to x only)
  • ∇x L(x, λ1, λ2) = ∇f(x) − λ1 ∇g1(x) − λ2 ∇g2(x) = 0
  • Thus ∇f(x) = λ1 ∇g1(x) + λ2 ∇g2(x)
• Maximize L(x, λ1, λ2) over λ1 ≥ 0, λ2 ≥ 0.
  • If g1(x) − b1 > 0, then λ1 = 0
  • If g2(x) − b2 > 0, then λ2 = 0
(A sketch of recovering λ1, λ2 numerically follows below.)
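As a non-authoritative illustration, given a candidate optimum where both constraints are binding, one can try to express ∇f(x) as a nonnegative combination of ∇g1(x) and ∇g2(x), i.e., check that ∇f lies in the cone they span. The sketch below does this with a nonnegative least-squares fit; the gradient vectors are made-up example data.

```python
import numpy as np
from scipy.optimize import nnls

# Made-up gradients at a candidate point where both constraints are binding
grad_f = np.array([3.0, 4.0])
grad_g1 = np.array([1.0, 0.0])
grad_g2 = np.array([1.0, 2.0])

# Solve grad_f = lam1*grad_g1 + lam2*grad_g2 with lam1, lam2 >= 0
G = np.column_stack([grad_g1, grad_g2])
lam, residual = nnls(G, grad_f)

print("multipliers:", lam)        # [1. 2.]
print("in the cone:", residual < 1e-8 and np.all(lam >= 0))
```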


KKT: Inequality Constrained (multiple constraints)

min f(x1, x2, ..., xn)
s.t. g1(x1, x2, ..., xn) ≥ b1
     g2(x1, x2, ..., xn) ≥ b2
     ...
     gm(x1, x2, ..., xn) ≥ bm


KKT Conditions: Inequality Case

The Karush-Kuhn-Tucker Theorem: If the function f(x) has a minimum at x* in the feasible set, and if ∇f(x*) and ∇gi(x*), i = 1, 2, …, m, exist (together with a suitable constraint qualification), then there is an m-dimensional vector λ such that
  λ ≥ 0
  ∇f(x*) − Σi=1..m λi ∇gi(x*) = 0
  λi [gi(x*) − bi] = 0, for i = 1, 2, …, m.

Such a point (x*, λ) is called a KKT point, and λ is called the Dual Vector or the Lagrange Multipliers. Furthermore, these conditions are sufficient if (as we have assumed here) we are dealing with a convex programming problem. (A general residual check of these conditions is sketched below.)
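As an illustration (not part of the original slides), the function below evaluates the three parts of the theorem at a candidate pair (x*, λ); kkt_residuals and all of its inputs are made up for this sketch.

```python
import numpy as np

def kkt_residuals(grad_f, grad_gs, gs, bs, x, lams):
    """Residuals of the KKT conditions for: minimize f(x) s.t. g_i(x) >= b_i."""
    stationarity = grad_f(x) - sum(l * gg(x) for l, gg in zip(lams, grad_gs))
    primal = np.array([gi(x) - bi for gi, bi in zip(gs, bs)])   # should be >= 0
    dual = np.asarray(lams)                                     # should be >= 0
    comp_slack = dual * primal                                  # should be == 0
    return stationarity, primal, dual, comp_slack

# Example: minimize x1^2 + x2^2  s.t.  x1 >= 1, x2 >= -1
grad_f = lambda x: 2.0 * x
gs = [lambda x: x[0], lambda x: x[1]]
grad_gs = [lambda x: np.array([1.0, 0.0]), lambda x: np.array([0.0, 1.0])]
bs = [1.0, -1.0]

x_star = np.array([1.0, 0.0])    # first constraint binding, second not
lams = [2.0, 0.0]
print(kkt_residuals(grad_f, grad_gs, gs, bs, x_star, lams))
```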


Example: KKT Conditions

min x1² + x2²
s.t. −0.25 (x1 − 2)² − (x2 − 2)² ≥ −1

KKT Conditions:
  (2x1, 2x2) − λ (0.5(2 − x1), 2(2 − x2)) = (0, 0)
  −0.25 (x1 − 2)² − (x2 − 2)² ≥ −1
  λ ≥ 0
  λ [1 − 0.25 (x1 − 2)² − (x2 − 2)²] = 0

Example: KKT Conditions

The curve (surface) of the objective function is tangent to the constraint curve (surface) at the optimal point.
[Figure: level curve of the objective tangent to the constraint curve at the optimum, with −∇f(x) and ∇g(x) drawn at the point of tangency]

Example: Computation of the KKT Condition

From (2x1, 2x2) − λ (0.5(2 − x1), 2(2 − x2)) = (0, 0):
  x1 = 2λ / (4 + λ);  x2 = 2λ / (1 + λ)
• If λ = 0, then x1 = 0 and x2 = 0, which violates the constraint (−0.25·(0 − 2)² − (0 − 2)² = −5 < −1). Therefore λ must be positive and the constraint must hold with equality.
• Plugging the two values x1(λ) and x2(λ) into the constraint with equality gives λ ≈ 1.8.
• We can then solve for x1 ≈ 0.61 and x2 ≈ 1.28.
(A numerical check of these values is sketched below.)
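As an illustration (not from the original slides), the sketch below substitutes x1(λ) = 2λ/(4+λ) and x2(λ) = 2λ/(1+λ) into the binding constraint and solves the resulting scalar equation for λ with a bracketing root finder, then recovers x1 and x2.

```python
import numpy as np
from scipy.optimize import brentq

def constraint_residual(lam):
    # Binding constraint: -0.25*(x1 - 2)**2 - (x2 - 2)**2 + 1 = 0
    x1 = 2.0 * lam / (4.0 + lam)
    x2 = 2.0 * lam / (1.0 + lam)
    return -0.25 * (x1 - 2.0) ** 2 - (x2 - 2.0) ** 2 + 1.0

lam = brentq(constraint_residual, 0.1, 10.0)   # bracket chosen by inspection
x1 = 2.0 * lam / (4.0 + lam)
x2 = 2.0 * lam / (1.0 + lam)
print(round(lam, 2), round(x1, 2), round(x2, 2))   # ~1.77, 0.61, 1.28 (consistent with lambda ~ 1.8 above)
```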

Economic Interpretation of Lagrange Multipliers

• As with LPs, there is actually a whole area of duality theory that corresponds to NLPs.
• In this vein, we can view the Lagrange multipliers as shadow prices for the constraints in an NLP (corresponding to the y vector in LP). (A numerical illustration of this follows below.)
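As a non-authoritative illustration of the shadow-price reading, the sketch below solves the earlier example, min x1² + x2² s.t. −0.25(x1−2)² − (x2−2)² ≥ b, for b = −1 and for a slightly tightened b, and compares the change in the optimal value with λ·Δb; the solver choice and step size are assumptions of this sketch.

```python
import numpy as np
from scipy.optimize import minimize

def optimal_value(b):
    f = lambda x: x[0] ** 2 + x[1] ** 2
    con = {'type': 'ineq',
           'fun': lambda x: -0.25 * (x[0] - 2.0) ** 2 - (x[1] - 2.0) ** 2 - b}
    res = minimize(f, x0=[2.0, 2.0], constraints=[con], method='SLSQP')
    return res.fun

lam = 1.77          # multiplier found for b = -1 in the earlier sketch
db = 0.01           # tighten the constraint slightly: b goes from -1 to -0.99
dv = optimal_value(-1.0 + db) - optimal_value(-1.0)
print(dv, lam * db)  # the two numbers should be close: dv/db ~ lambda
```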


KKT Conditions: Final Notes

• KKT conditions may not lead directly to a very efficient algorithm for solving NLPs. However, they do have a number of benefits:
  • They give insight into what optimal solutions to NLPs look like
  • They provide a way to set up and solve small problems
  • They provide a method to check solutions to large problems
  • The Lagrange multipliers can be seen as shadow prices of the constraints

