
Optimization using Calculus

Optimization of Functions of Multiple Variables: Unconstrained Optimization



Objectives

• To study functions of multiple variables, which are more difficult to analyze owing to the difficulty of graphical representation and the tedious calculations involved in their mathematical analysis for unconstrained optimization.
• To study the above with the aid of the gradient vector and the Hessian matrix.
• To discuss the implementation of the technique through examples.



Unconstrained optimization

• If a convex function is to be minimized, the stationary point is the global minimum, and the analysis is relatively straightforward, as discussed earlier.
• A similar situation exists for maximizing a concave function.
• The necessary and sufficient conditions for the optimization of an unconstrained function of several variables are discussed below.



Necessary condition

In the case of multivariable functions, a necessary condition for a stationary point of the function f(X) is that each partial derivative be equal to zero. In other words, every element of the gradient vector of f(X) at X = X*, defined as follows, must equal zero:

$$\nabla_x f = \begin{bmatrix} \dfrac{\partial f}{\partial x_1}(X^*) \\[6pt] \dfrac{\partial f}{\partial x_2}(X^*) \\[6pt] \vdots \\[6pt] \dfrac{\partial f}{\partial x_n}(X^*) \end{bmatrix} = 0$$
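As an illustration (not part of the original slides), here is a minimal Python sketch that checks this necessary condition for the example function analyzed later in this module; the gradient is written out by hand, and the candidate point is the stationary point found in that example:

```python
import numpy as np

# Gradient of f(X) = -x1^2 - x2^2 - x3^2 + 2*x1*x2 + 2*x1*x3 + 4*x1 - 5*x3 + 2
# (the example function from this module), written out by hand.
def grad_f(x):
    x1, x2, x3 = x
    return np.array([
        -2*x1 + 2*x2 + 2*x3 + 4,   # df/dx1
        2*x1 - 2*x2,               # df/dx2
        2*x1 - 2*x3 - 5,           # df/dx3
    ])

X_star = np.array([0.5, 0.5, -2.0])       # candidate stationary point
print(grad_f(X_star))                     # [0. 0. 0.]
print(np.allclose(grad_f(X_star), 0.0))   # True -> necessary condition holds
```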
Sufficient condition

• For a stationary point X* to be an extreme point, the matrix of second partial derivatives (the Hessian matrix) of f(X) evaluated at X* must be:
  – positive definite when X* is a point of relative minimum, and
  – negative definite when X* is a point of relative maximum.
• When all eigenvalues of the Hessian are negative for all possible values of X, X* is a global maximum; when all eigenvalues are positive for all possible values of X, X* is a global minimum.
• If some of the eigenvalues of the Hessian at X* are positive and some negative, or if some are zero, the stationary point X* is neither a local maximum nor a local minimum, as checked numerically in the sketch below.
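These eigenvalue tests are easy to apply numerically. The following is a minimal Python sketch (not from the original slides; the helper name classify_stationary_point and the tolerance are illustrative assumptions), using the Hessian of the example function analyzed on the following slides:

```python
import numpy as np

def classify_stationary_point(H, tol=1e-9):
    """Classify a stationary point from the eigenvalues of the symmetric Hessian H."""
    eigs = np.linalg.eigvalsh(H)   # real eigenvalues, ascending order
    if np.all(eigs > tol):
        return "relative minimum (Hessian positive definite)"
    if np.all(eigs < -tol):
        return "relative maximum (Hessian negative definite)"
    return "neither a maximum nor a minimum (indefinite or semidefinite Hessian)"

# Hessian of the example function from the next slide
H = np.array([[-2.0,  2.0,  2.0],
              [ 2.0, -2.0,  0.0],
              [ 2.0,  0.0, -2.0]])
print(np.linalg.eigvalsh(H))          # ~[-4.83, -2.0, 0.83] -> mixed signs
print(classify_stationary_point(H))
```

np.linalg.eigvalsh is used because the Hessian is symmetric, which guarantees real eigenvalues; the tolerance guards against misclassifying eigenvalues that are zero up to floating-point noise.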



Example

Analyze the function

$$f(X) = -x_1^2 - x_2^2 - x_3^2 + 2x_1x_2 + 2x_1x_3 + 4x_1 - 5x_3 + 2$$

and classify the stationary points as maxima, minima, or points of inflection.
Solution
$$\nabla_x f = \begin{bmatrix} \dfrac{\partial f}{\partial x_1}(X^*) \\[6pt] \dfrac{\partial f}{\partial x_2}(X^*) \\[6pt] \dfrac{\partial f}{\partial x_3}(X^*) \end{bmatrix} = \begin{bmatrix} -2x_1 + 2x_2 + 2x_3 + 4 \\ -2x_2 + 2x_1 \\ -2x_3 + 2x_1 - 5 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}$$
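Because f is quadratic, its gradient is linear in X, so the stationary point can also be obtained by solving this 3×3 linear system numerically; a minimal sketch (not part of the original slides):

```python
import numpy as np

# Gradient = 0 rewritten as A x = b, from the system above:
#   -2*x1 + 2*x2 + 2*x3 = -4
#    2*x1 - 2*x2        =  0
#    2*x1        - 2*x3 =  5
A = np.array([[-2.0,  2.0,  2.0],
              [ 2.0, -2.0,  0.0],
              [ 2.0,  0.0, -2.0]])
b = np.array([-4.0, 0.0, 5.0])
print(np.linalg.solve(A, b))   # [ 0.5  0.5 -2. ] -> X* = (1/2, 1/2, -2)
```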



Example …contd.
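The worked solution on the original slides did not survive extraction; the steps below are reconstructed directly from the gradient system on the previous slide.

From the second equation, $x_2 = x_1$. Substituting into the first equation gives $2x_3 + 4 = 0$, i.e. $x_3 = -2$; the third equation then yields $4 + 2x_1 - 5 = 0$, i.e. $x_1 = 1/2$. Hence the only stationary point is

$$X^* = \begin{bmatrix} 1/2 \\ 1/2 \\ -2 \end{bmatrix}$$

The Hessian of $f$ is constant:

$$H = \begin{bmatrix} -2 & 2 & 2 \\ 2 & -2 & 0 \\ 2 & 0 & -2 \end{bmatrix}$$

Its characteristic equation $\det(H - \lambda I) = (-2-\lambda)\left[(-2-\lambda)^2 - 8\right] = 0$ gives the eigenvalues

$$\lambda_1 = -2, \qquad \lambda_2 = -2 + 2\sqrt{2} \approx 0.83, \qquad \lambda_3 = -2 - 2\sqrt{2} \approx -4.83$$

Since the eigenvalues are of mixed sign, $H$ is indefinite, so $X^*$ is neither a relative maximum nor a relative minimum: it is a saddle point (a point of inflection).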



Thank you

