MMEcon Handouts 15 – Lagrange Function

The document discusses using Lagrange multipliers to find the extrema of an objective function subject to a constraint. It introduces the Lagrange function and shows how to find critical points by setting partial derivatives equal to zero. The determinant of the bordered Hessian matrix is used to determine whether critical points are maxima or minima.

Chapter 15

Lagrange Function

Josef Leydold – Foundations of Mathematics – WS 2023/24 15 – Lagrange Function – 1 / 28


Constrained Optimization

Find the extrema of the function

f(x, y)

subject to

g(x, y) = c

Example:
Find the extrema of the function

f(x, y) = x² + 2y²

subject to

g(x, y) = x + y = 3



Graphical Solution

For the case of two variables we can find a solution graphically.

1. Draw the constraint g(x, y) = c in the xy-plane.


(The feasible region is a curve in the plane)

2. Draw appropriate contour lines of the objective function f(x, y).

3. Investigate which contour lines of the objective function intersect
   the feasible region, and estimate the (approximate) location of the
   extrema.



Example – Graphical Solution

[Figure: contour lines of f and the constraint line x + y = 3; at the minimum (2, 1) the gradients are parallel, ∇f = λ ∇g.]

Extrema of f(x, y) = x² + 2y² subject to g(x, y) = x + y = 3



Lagrange Approach

Let x∗ be an extremum of f ( x, y) subject to g( x, y) = c.


Then ∇f(x∗) and ∇g(x∗) are proportional (provided ∇g(x∗) ≠ 0), i.e.,

∇f(x∗) = λ ∇g(x∗)

where λ is some proportionality factor.

fx(x∗) = λ gx(x∗)
fy(x∗) = λ gy(x∗)
g(x∗) = c

Transformation yields

fx(x∗) − λ gx(x∗) = 0
fy(x∗) − λ gy(x∗) = 0
c − g(x∗) = 0

The l.h.s. is the gradient of L(x, y; λ) = f(x, y) + λ (c − g(x, y)).



Lagrange Function

We create a new function from f, g and an auxiliary variable λ, called the Lagrange function:

L(x, y; λ) = f(x, y) + λ (c − g(x, y))

The auxiliary variable λ is called the Lagrange multiplier.

Local extrema of f subject to g(x, y) = c are critical points of the Lagrange function L:

Lx = fx − λ gx = 0
Ly = fy − λ gy = 0
Lλ = c − g(x, y) = 0



Example – Lagrange Function

Compute the local extrema of

f(x, y) = x² + 2y²  subject to  g(x, y) = x + y = 3

Lagrange function:

L(x, y; λ) = (x² + 2y²) + λ(3 − (x + y))

Critical points:

Lx = 2x − λ = 0
Ly = 4y − λ = 0
Lλ = 3 − x − y = 0

⇒ unique critical point: (x0; λ0) = (2, 1; 4)
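The three stationarity equations can be handed to a computer algebra system; a minimal sketch using sympy (the library choice is mine, not part of the slides):

```python
import sympy as sp

x, y, lam = sp.symbols("x y lam")

# Lagrange function L(x, y; lam) = f(x, y) + lam * (c - g(x, y))
L = (x**2 + 2*y**2) + lam * (3 - (x + y))

# Critical points: set all first partial derivatives of L to zero
eqs = [sp.diff(L, v) for v in (x, y, lam)]   # 2x - lam, 4y - lam, 3 - x - y
sols = sp.solve(eqs, (x, y, lam), dict=True)

x0, y0, lam0 = (sols[0][v] for v in (x, y, lam))
print(x0, y0, lam0)  # 2 1 4
```

This reproduces the unique critical point (x0; λ0) = (2, 1; 4) from the slide.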



Bordered Hessian Matrix

The matrix

H̄(x; λ) = ⎡ 0    gx   gy  ⎤
           ⎢ gx   Lxx  Lxy ⎥
           ⎣ gy   Lyx  Lyy ⎦

is called the bordered Hessian matrix.

Sufficient condition for local extremum:

Let (x0 ; λ0 ) be a critical point of L.


▶ |H̄(x0; λ0)| > 0 ⇒ x0 is a local maximum
▶ |H̄(x0; λ0)| < 0 ⇒ x0 is a local minimum
▶ |H̄(x0; λ0)| = 0 ⇒ no conclusion possible



Example – Bordered Hessian Matrix

Compute the local extrema of

f(x, y) = x² + 2y²  subject to  g(x, y) = x + y = 3

Lagrange function: L(x, y; λ) = (x² + 2y²) + λ(3 − x − y)


Critical point: (x0; λ0) = (2, 1; 4)

Determinant of the bordered Hessian:

               | 0   gx   gy  |   | 0  1  1 |
|H̄(x0; λ0)| =  | gx  Lxx  Lxy | = | 1  2  0 | = −6 < 0
               | gy  Lyx  Lyy |   | 1  0  4 |

⇒ x0 = (2, 1) is a local minimum.
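The sign of the determinant can be double-checked numerically; a small numpy sketch (my own illustration, not part of the slides):

```python
import numpy as np

# Bordered Hessian at (x0; lam0) = (2, 1; 4):
# border entries g_x = g_y = 1, inner block L_xx = 2, L_xy = L_yx = 0, L_yy = 4
H_bar = np.array([[0.0, 1.0, 1.0],
                  [1.0, 2.0, 0.0],
                  [1.0, 0.0, 4.0]])

det = float(np.linalg.det(H_bar))
print(round(det, 6))  # -6.0, negative => x0 = (2, 1) is a local minimum
```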



Many Variables and Constraints

Objective function
f ( x1 , . . . , x n )
and constraints
g1(x1, ..., xn) = c1
   ⋮                    (k < n)
gk(x1, ..., xn) = ck

Optimization problem: min / max f (x) subject to g(x) = c.

Lagrange Function:

L(x; λ) = f(x) + λᵀ (c − g(x))



Recipe – Critical Points

1. Create the Lagrange function L:

   L(x1, ..., xn; λ1, ..., λk) = f(x1, ..., xn) + ∑ᵢ₌₁ᵏ λi (ci − gi(x1, ..., xn))

2. Compute all first partial derivatives of L.

3. We get a system of n + k equations in n + k unknowns. Find all solutions.

4. The first n components (x1, ..., xn) of each solution form a critical point.
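The four steps can be wrapped into a small helper; a sketch using sympy, where the function name `lagrange_critical_points` and its interface are my own, not from the slides:

```python
import sympy as sp

def lagrange_critical_points(f, constraints, xs):
    """Steps 1-4: build L, take all first partials, solve the (n+k)-system.

    constraints: list of (g_i, c_i) pairs for the conditions g_i(x) = c_i.
    """
    lams = sp.symbols(f"lam1:{len(constraints) + 1}")
    # Step 1: L = f + sum_i lam_i * (c_i - g_i)
    L = f + sum(l * (c - g) for l, (g, c) in zip(lams, constraints))
    # Steps 2-3: gradient of L = 0, solved for all n + k unknowns
    sols = sp.solve([sp.diff(L, v) for v in (*xs, *lams)],
                    (*xs, *lams), dict=True)
    # Step 4: the first n components form the critical points
    return [tuple(s[v] for v in xs) for s in sols]

x, y = sp.symbols("x y")
print(lagrange_critical_points(x**2 + 2*y**2, [(x + y, 3)], (x, y)))  # [(2, 1)]
```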



Example – Critical Points

Compute all critical points of

f(x1, x2, x3) = (x1 − 1)² + (x2 − 2)² + 2 x3²

subject to
x1 + 2 x2 = 2  and  x2 − x3 = 3

Lagrange function:
L(x1, x2, x3; λ1, λ2) = ((x1 − 1)² + (x2 − 2)² + 2 x3²) + λ1(2 − x1 − 2 x2) + λ2(3 − x2 + x3)



Example – Critical Points

Partial derivatives (gradient):

Lx1 = 2 (x1 − 1) − λ1 = 0
Lx2 = 2 (x2 − 2) − 2 λ1 − λ2 = 0
Lx3 = 4 x3 + λ2 = 0
Lλ1 = 2 − x1 − 2 x2 = 0
Lλ2 = 3 − x2 + x3 = 0

We get the critical points of L by solving this system of equations:

x1 = −6/7, x2 = 10/7, x3 = −11/7;  λ1 = −26/7, λ2 = 44/7

The unique critical point of f subject to these constraints is
x0 = (−6/7, 10/7, −11/7).
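A quick symbolic check of these fractions (my own verification, using sympy):

```python
import sympy as sp

x1, x2, x3, l1, l2 = sp.symbols("x1 x2 x3 lam1 lam2")

L = ((x1 - 1)**2 + (x2 - 2)**2 + 2*x3**2) \
    + l1*(2 - x1 - 2*x2) + l2*(3 - x2 + x3)

# Solve the 5-equation system L_x1 = ... = L_lam2 = 0
sols = sp.solve([sp.diff(L, v) for v in (x1, x2, x3, l1, l2)],
                (x1, x2, x3, l1, l2), dict=True)

print(sols[0][x1], sols[0][x2], sols[0][x3])  # -6/7 10/7 -11/7
```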



Bordered Hessian Matrix

H̄(x; λ) =
⎡ 0         ...  0         ∂g1/∂x1   ...  ∂g1/∂xn ⎤
⎢ ⋮              ⋮         ⋮              ⋮       ⎥
⎢ 0         ...  0         ∂gk/∂x1   ...  ∂gk/∂xn ⎥
⎢ ∂g1/∂x1   ...  ∂gk/∂x1   Lx1x1     ...  Lx1xn   ⎥
⎢ ⋮              ⋮         ⋮              ⋮       ⎥
⎣ ∂g1/∂xn   ...  ∂gk/∂xn   Lxnx1     ...  Lxnxn   ⎦

For r = k + 1, ..., n
let Br(x; λ) denote the (k + r)-th leading principal minor of H̄(x; λ).



Sufficient Condition for Local Extrema

Assume that (x0 ; λ0 ) is a critical point of L. Then

▶ (−1)^k Br(x0; λ0) > 0 for all r = k + 1, ..., n
  ⇒ x0 is a local minimum
▶ (−1)^r Br(x0; λ0) > 0 for all r = k + 1, ..., n
  ⇒ x0 is a local maximum

(n is the number of variables xi and k is the number of constraints.)



Example – Sufficient Condition for Local Extrema

Compute all extrema of f(x1, x2, x3) = (x1 − 1)² + (x2 − 2)² + 2 x3²
subject to constraints x1 + 2 x2 = 2 and x2 − x3 = 3

Lagrange function:
L(x1, x2, x3; λ1, λ2) = ((x1 − 1)² + (x2 − 2)² + 2 x3²) + λ1(2 − x1 − 2 x2) + λ2(3 − x2 + x3)

Critical point of L:
x1 = −6/7, x2 = 10/7, x3 = −11/7;  λ1 = −26/7, λ2 = 44/7



Example – Sufficient Condition for Local Extrema

Bordered Hessian matrix:

H̄(x; λ) = ⎡ 0   0   1   2   0 ⎤
           ⎢ 0   0   0   1  −1 ⎥
           ⎢ 1   0   2   0   0 ⎥
           ⎢ 2   1   0   2   0 ⎥
           ⎣ 0  −1   0   0   4 ⎦

3 variables, 2 constraints: n = 3, k = 2  ⇒  r = 3

B3 = |H̄(x; λ)| = 14

(−1)^k Br = (−1)² B3 = 14 > 0   condition satisfied
(−1)^r Br = (−1)³ B3 = −14 < 0  not satisfied

Critical point x0 = (−6/7, 10/7, −11/7) is a local minimum.
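The value B3 = 14 can be verified numerically; a numpy sketch (my own check):

```python
import numpy as np

# Bordered Hessian at the critical point; the first two rows/columns are the
# constraint gradients (1, 2, 0) and (0, 1, -1) bordering the Hessian of L.
H_bar = np.array([[0,  0, 1, 2,  0],
                  [0,  0, 0, 1, -1],
                  [1,  0, 2, 0,  0],
                  [2,  1, 0, 2,  0],
                  [0, -1, 0, 0,  4]], dtype=float)

# With n = 3, k = 2 only r = 3 is needed: B3 is the full 5x5 determinant
B3 = float(np.linalg.det(H_bar))
print(round(B3))  # 14, so (-1)**2 * B3 = 14 > 0  =>  local minimum
```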



Sufficient Condition for Global Extrema

Let (x∗, λ∗) be a critical point of the Lagrange function L of optimization problem

min / max f(x) subject to g(x) = c

If L(x, λ∗) is concave (convex) in x, then x∗ is a global maximum (global minimum) of f(x) subject to g(x) = c.



Example – Sufficient Condition for Global Extrema

(x∗, y∗; λ∗) = (2, 1; 4) is a critical point of the Lagrange function L of optimization problem

min / max f(x, y) = x² + 2y² subject to g(x, y) = x + y = 3

Lagrange function:
L(x, y, λ∗) = (x² + 2y²) + 4 · (3 − (x + y))

Hessian matrix:

HL(x, y) = ⎡ 2  0 ⎤    H1 = 2 > 0
           ⎣ 0  4 ⎦    H2 = 8 > 0

L is convex in (x, y).

Thus (x∗, y∗) = (2, 1) is a global minimum.
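The positive-definiteness check H1, H2 > 0 can be automated; a small numpy sketch (mine, not part of the slides):

```python
import numpy as np

# Hessian of L(x, y, 4) = x^2 + 2y^2 + 4*(3 - x - y); the lambda-term is
# linear in (x, y), so it does not contribute to the Hessian.
H = np.array([[2.0, 0.0],
              [0.0, 4.0]])

# Leading principal minors H1, H2; all positive <=> H positive definite
minors = [round(np.linalg.det(H[:r, :r])) for r in (1, 2)]
print(minors)  # [2, 8] -> L convex in (x, y), hence (2, 1) is a global minimum
```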



Example – Sufficient Condition for Global Extrema

(x∗; λ∗) = (−6/7, 10/7, −11/7; −26/7, 44/7)
is a critical point of the Lagrange function of optimization problem

min / max f(x1, x2, x3) = (x1 − 1)² + (x2 − 2)² + 2 x3²
subject to  g1(x1, x2, x3) = x1 + 2 x2 = 2
            g2(x1, x2, x3) = x2 − x3 = 3

Lagrange function:
L(x; λ∗) = ((x1 − 1)² + (x2 − 2)² + 2 x3²) − (26/7)(2 − x1 − 2 x2) + (44/7)(3 − x2 + x3)



Example – Sufficient Condition for Global Extrema

Hessian matrix:

HL(x1, x2, x3) = ⎡ 2  0  0 ⎤    H1 = 2 > 0
                 ⎢ 0  2  0 ⎥    H2 = 4 > 0
                 ⎣ 0  0  4 ⎦    H3 = 16 > 0

L is convex in x.

x∗ = (−6/7, 10/7, −11/7) is a global minimum.



Interpretation of Lagrange Multiplier

Extremum x∗ of optimization problem

min / max f (x) subject to g(x) = c

depends on c, x∗ = x∗(c), and so does the extremal value

f ∗(c) = f (x∗(c))

How does f ∗(c) change with varying c?

∂f∗/∂cj (c) = λj∗(c)

That is, the Lagrange multiplier λj is the derivative of the extremal value w.r.t. the exogenous variable cj in constraint gj(x) = cj.



Proof Idea

The Lagrange function L and the objective function f coincide in the extremum x∗.

∂f∗(c)/∂cj = ∂L(x∗(c), λ∗(c))/∂cj                                   [chain rule]

           = ∑ᵢ₌₁ⁿ Lxi(x∗(c), λ∗(c)) · ∂xi∗(c)/∂cj  +  ∂L(x, c)/∂cj |(x∗(c), λ∗(c))

The sum vanishes, as x∗ is a critical point of L (all Lxi = 0). Hence

∂f∗(c)/∂cj = ∂L(x, c)/∂cj |(x∗(c), λ∗(c))
           = ∂/∂cj [ f(x) + ∑ᵢ₌₁ᵏ λi (ci − gi(x)) ] |(x∗(c), λ∗(c))
           = λj∗(c)



Example – Lagrange Multiplier

(x∗, y∗) = (2, 1) is a minimum of optimization problem

min / max f(x, y) = x² + 2y²
subject to g(x, y) = x + y = c = 3

with λ∗ = 4.

How does the minimal value f∗(c) change with varying c?

df∗/dc = λ∗ = 4
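This can be confirmed by a finite-difference check on f∗(c); a small sketch (my own), using the minimizer x = 2c/3, y = c/3 obtained from the first-order conditions 2x = λ = 4y and x + y = c:

```python
# Closed-form minimizer from the first-order conditions: x = 2c/3, y = c/3,
# hence the minimal value is f*(c) = (2c/3)**2 + 2*(c/3)**2 = 2*c**2/3.
def f_star(c):
    return 2.0 * c**2 / 3.0

# Central finite difference of f* at c = 3
h = 1e-6
dfdc = (f_star(3 + h) - f_star(3 - h)) / (2 * h)
print(round(dfdc, 4))  # 4.0, matching lam* = 4
```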



Envelope Theorem

What is the derivative of the extremal value f∗ of optimization problem

min / max f(x, p) subject to g(x, p) = c

w.r.t. the parameters (exogenous variables) p?

∂f∗(p)/∂pj = ∂L(x, p)/∂pj |(x∗(p), λ∗(p))



Example – Roy’s Identity

Maximize the utility function

max U(x) subject to pᵀ · x = w

The maximal utility depends on prices p and income w:

U∗ = U∗(p, w)   [indirect utility function]

Lagrange function: L(x, λ) = U(x) + λ (w − pᵀ · x)

∂U∗/∂pj = ∂L/∂pj = −λ∗ xj∗   and   ∂U∗/∂w = ∂L/∂w = λ∗

and thus

xj∗ = − (∂U∗/∂pj) / (∂U∗/∂w)   [Marshallian demand function]
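Roy's identity can be illustrated with a concrete utility; a sympy sketch where the Cobb–Douglas form U(x1, x2) = √(x1 x2) and its well-known demands are my example, not from the slides:

```python
import sympy as sp

p1, p2, w = sp.symbols("p1 p2 w", positive=True)

# For U(x1, x2) = sqrt(x1*x2) the Marshallian demands are x1* = w/(2*p1) and
# x2* = w/(2*p2) (half the income is spent on each good), so the indirect
# utility is U*(p, w) = sqrt(x1* * x2*).
U_star = sp.sqrt((w / (2*p1)) * (w / (2*p2)))

# Roy's identity: x1* = - (dU*/dp1) / (dU*/dw)
x1_roy = sp.simplify(-sp.diff(U_star, p1) / sp.diff(U_star, w))
print(x1_roy)  # equals w/(2*p1), reproducing the Marshallian demand
```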



Example – Shephard’s Lemma

Minimize expenses

min pᵀ · x subject to U(x) = ū

The expenditure function (minimal expenses) depends on prices p and the level ū of utility: e = e(p, ū)

Lagrange function: L(x, λ) = pᵀ · x + λ (ū − U(x))

∂e/∂pj = ∂L/∂pj = xj∗   [Hicksian demand function]



Summary

▶ constrained optimization
▶ graphical solution
▶ Lagrange function and Lagrange multiplier
▶ extremum and critical point
▶ bordered Hessian matrix
▶ global extremum
▶ interpretation of Lagrange multiplier
▶ envelope theorem

