
Optimization using Calculus

Optimization of Functions of Multiple Variables: Unconstrained Optimization

D Nagesh Kumar, IISc (Optimization Methods: M2L3)


Objectives

• To study functions of multiple variables, which are more difficult to analyze owing to the difficulty of graphical representation and the tedious calculations involved in their mathematical analysis for unconstrained optimization.
• To study the above with the aid of the gradient vector and the Hessian matrix.
• To discuss the implementation of the technique through examples.



Unconstrained optimization

• If a convex function is to be minimized, the stationary point is the global minimum and the analysis is relatively straightforward, as discussed earlier.

• A similar situation exists for maximizing a concave function.

• The necessary and sufficient conditions for the optimization of an unconstrained function of several variables are discussed below.



Necessary condition

• In the case of multivariable functions, a necessary condition for a stationary point of the function f(X) is that each partial derivative be equal to zero. In other words, the gradient vector of f(X) at X = X*, defined as follows, must equal zero:
$$\nabla_x f = \begin{bmatrix} \dfrac{\partial f}{\partial x_1}(X^*) \\[4pt] \dfrac{\partial f}{\partial x_2}(X^*) \\[4pt] \vdots \\[4pt] \dfrac{\partial f}{\partial x_n}(X^*) \end{bmatrix} = 0$$
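As a concrete illustration (not part of the original slides), here is a minimal sympy sketch that forms the gradient vector of a hypothetical two-variable function and solves ∇f = 0 for its stationary point; the function f = x₁² + x₁x₂ + x₂² − 3x₁ is chosen purely for illustration:

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
# Hypothetical function, chosen only to illustrate the necessary condition
f = x1**2 + x1*x2 + x2**2 - 3*x1

# Gradient vector: the partial derivatives of f
grad = [sp.diff(f, v) for v in (x1, x2)]
print(grad)                      # [2*x1 + x2 - 3, x1 + 2*x2]

# Stationary points: solve grad = 0
print(sp.solve(grad, [x1, x2]))  # {x1: 2, x2: -1}
```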
Sufficient condition

• For a stationary point X* to be an extreme point, the matrix of second partial derivatives (the Hessian matrix) of f(X) evaluated at X* must be:
  – positive definite when X* is a point of relative minimum, and
  – negative definite when X* is a point of relative maximum.

• When all eigenvalues of the Hessian are negative for all possible values of X, then X* is a global maximum, and when all eigenvalues are positive for all possible values of X, then X* is a global minimum.

• If some of the eigenvalues of the Hessian at X* are positive and some negative, or if some are zero, then the stationary point X* is neither a local maximum nor a local minimum.
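A small numerical sketch of this test (not in the original slides), assuming numpy: given the Hessian evaluated at a stationary point, the signs of its eigenvalues classify the point.

```python
import numpy as np

def classify_stationary_point(H):
    """Apply the sufficient condition: classify a stationary point
    from the eigenvalue signs of the Hessian evaluated there."""
    eig = np.linalg.eigvalsh(H)  # eigvalsh: Hessians are symmetric
    if np.all(eig > 0):
        return "relative minimum (Hessian positive definite)"
    if np.all(eig < 0):
        return "relative maximum (Hessian negative definite)"
    return "neither a local maximum nor a local minimum"

# Hypothetical Hessians, for illustration only
print(classify_stationary_point(np.array([[2.0, 1.0], [1.0, 2.0]])))   # minimum
print(classify_stationary_point(np.array([[-2.0, 0.0], [0.0, 1.0]])))  # neither
```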



Example

Analyze the function

$$f(X) = -x_1^2 - x_2^2 - x_3^2 + 2x_1x_2 + 2x_1x_3 + 4x_1 - 5x_3 + 2$$

and classify the stationary points as maxima, minima and points of inflection.

Solution
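The worked steps on the original slides did not survive extraction. The sketch below reconstructs the standard calculation with sympy; the numbers follow from the function as reconstructed above and are not copied from the missing slides.

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
X = [x1, x2, x3]
f = -x1**2 - x2**2 - x3**2 + 2*x1*x2 + 2*x1*x3 + 4*x1 - 5*x3 + 2

# Necessary condition: solve grad f = 0 for the stationary point
grad = [sp.diff(f, v) for v in X]
print(sp.solve(grad, X))   # {x1: 1/2, x2: 1/2, x3: -2}

# Sufficient condition: eigenvalues of the (constant) Hessian
H = sp.hessian(f, X)
print(H)                   # Matrix([[-2, 2, 2], [2, -2, 0], [2, 0, -2]])
print(H.eigenvals())       # {-2: 1, -2 - 2*sqrt(2): 1, -2 + 2*sqrt(2): 1}
```

Since one eigenvalue (−2 + 2√2 ≈ 0.83) is positive while the other two are negative, the Hessian is indefinite, so X* = (1/2, 1/2, −2) is neither a relative maximum nor a relative minimum.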



Theorem. The eigenvalues of a triangular matrix are its diagonal entries.

Proof: Let the triangular matrix be

$$A = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ 0 & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & a_{nn} \end{bmatrix}$$

The characteristic equation of A is

$$\det(A - \lambda I) = \det \begin{bmatrix} a_{11}-\lambda & a_{12} & \cdots & a_{1n} \\ 0 & a_{22}-\lambda & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & a_{nn}-\lambda \end{bmatrix} = 0$$

or $(a_{11}-\lambda)(a_{22}-\lambda)\cdots(a_{nn}-\lambda) = 0$.

Hence $\lambda_1 = a_{11},\ \lambda_2 = a_{22},\ \ldots,\ \lambda_n = a_{nn}$.
Example:

$$A = \begin{bmatrix} 1 & 1 \\ 2 & 4 \end{bmatrix}$$

The characteristic equation of A is

$$\det(A - \lambda I) = \det \begin{bmatrix} 1-\lambda & 1 \\ 2 & 4-\lambda \end{bmatrix} = 0$$

$$(1-\lambda)(4-\lambda) - 2 = 0 \quad\Longrightarrow\quad \lambda^2 - 5\lambda + 6 = 0$$

The eigenvalues are therefore $\lambda_1 = 2,\ \lambda_2 = 3$.


Thank you
