29-Introduction To Classical Optimization-20-03-2024
Introduction to Optimization
The goal of decision making is to minimize the effort required
(or)
maximize the desired benefit.
Optimization plays a very important role in the design of engineering systems, particularly in design engineering, manufacturing engineering and systems engineering.
Engineering Applications of Optimization
Design of aerospace structures for minimum weight
Optimal trajectories of space vehicles
Minimum weight design of structures for random loading
Optimal design of linkages, cams, gears, machine tools and other mechanical components
Selection of machining conditions in metal cutting processes for minimum production cost
Design of material handling equipment like conveyors, trucks and cranes for minimum cost
Design of pumps, turbines and heat transfer equipment for maximum efficiency
Optimum design of control systems
Optimal production planning, controlling and scheduling
Analysis of statistical data and building empirical models from experimental results to
obtain the most accurate representation of the physical phenomenon
Planning of maintenance and replacement of equipment to reduce operation costs
Allocation of resources or services among several activities to maximize the benefit
General Statement of Optimization Problem
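In its standard form, the general optimization (nonlinear programming) problem can be stated as:

Find X = (x1, x2, ..., xn)^T which minimizes f(X)
subject to  g_j(X) ≤ 0,  j = 1, 2, ..., m   (inequality constraints)
            l_j(X) = 0,  j = 1, 2, ..., p   (equality constraints)

where X is the vector of design variables and f(X) is the objective function.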
Classification of Optimization
Classification based on
The existence of constraints
Constrained and Unconstrained
The nature of design variables
Parameter or Static optimization problems
Trajectory or Dynamic optimization problems
The physical structure of the problem
Optimal control and non-optimal control problems
The nature of equations involved
Linear, nonlinear, geometric and quadratic programming problems
The permissible values of the design variables
Integer programming problem, Real valued programming problem
The deterministic nature of variables involved
Deterministic Programming problem, Stochastic programming problem
The Separability of the functions
Separable and non-separable programming problems
The number of objective functions
Single and Multi-objective programming problems
Classical Optimization Techniques
• The classical optimization techniques are useful in finding the optimum solution (the unconstrained maxima or minima) of continuous and differentiable functions.
• These are analytical methods and make use of differential calculus in locating the
optimum solution.
• The classical methods have limited scope in practical applications, since many practical problems involve objective functions that are not continuous and/or differentiable.
• Yet the study of these classical techniques forms a basis for developing most of the numerical techniques that have evolved into advanced methods better suited to today’s practical problems.
Classical Optimization Techniques
• These methods assume that the function is twice differentiable with respect to the design variables and that the derivatives are continuous.
Stationary Points
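A point x* is called a stationary point of f if the first derivative vanishes there: f'(x*) = 0 for a single variable, or ∂f(X*)/∂xi = 0 for all i for several variables. Stationary points are the candidates for relative maxima, relative minima and points of inflection (or saddle points); the higher derivatives decide which case applies.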
Single Variable Optimization - Problem
• A single-variable optimization problem is one in which the value 𝑥 = 𝑥∗ is to be found
in the interval [a, b] such that 𝑥∗ minimizes 𝑓(𝑥).
• For example, determine the maximum and minimum values of the function
f(x) = 3x^4 − 4x^3 − 24x^2 + 48x + 15
• Two theorems provide the necessary and sufficient conditions for the relative
minimum of a function of a single variable
• Necessary condition
• Sufficient condition
Single Variable Optimization - Solution
Necessary condition: if f(x) has a relative minimum or maximum at x = x* and f'(x*) exists, then f'(x*) = 0.
The theorem does not say that the function necessarily will have a minimum or maximum at every point where the derivative is zero.
Sufficient condition: let f'(x*) = f''(x*) = ... = f^(n−1)(x*) = 0 and let f^(n)(x*) ≠ 0 be the first non-zero higher derivative. Then f(x*) is
i. A minimum if f^(n)(x*) > 0 and n is even.
ii. A maximum if f^(n)(x*) < 0 and n is even.
iii. Neither a minimum nor a maximum if n is odd. In this case x* is a point of inflection.
Single Variable Optimization – Example 1
Determine the maximum and minimum values of the function:
f(x) = 3x^4 − 4x^3 − 24x^2 + 48x + 15
Solution:
To find the stationary points, the necessary condition f'(x) = 0 is applied.
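Applying the necessary condition and then the second-derivative test:

f'(x) = 12x^3 − 12x^2 − 48x + 48 = 12(x − 1)(x − 2)(x + 2) = 0
⇒ stationary points x = 1, x = 2, x = −2
f''(x) = 36x^2 − 24x − 48
f''(1) = −36 < 0,  f''(2) = +48 > 0,  f''(−2) = +144 > 0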
Conclusion:
at x = 1, f(x) has a relative maximum and the value is f(1) = 38
at x = 2, f(x) has a relative minimum and the value is f(2) = 31
at x = −2, f(x) has a relative minimum and the value is f(−2) = −97
Single Variable Optimization – Example 2
Determine the stationary points of the following function: f(x) = 2x^6 − 6x^4 + 6x^2 + 10
Solution:
f(x) = 2x^6 − 6x^4 + 6x^2 + 10
f'(x) = 12x^5 − 24x^3 + 12x = 0
      = 12x(x^2 − 1)^2 = 0
      = 12x(x + 1)^2(x − 1)^2 = 0
Stationary points ⇒ x = 0, x = +1, x = −1
f''(x) = 60x^4 − 72x^2 + 12
• at x = 0, f''(x) = 12 > 0 (relative minimum)
• at x = +1 and at x = −1, f''(x) = 0, so the second-derivative test is inconclusive.
Checking the next higher derivative: f'''(x) = 240x^3 − 144x, giving f'''(+1) = 96 ≠ 0 and f'''(−1) = −96 ≠ 0.
Here n = 3: the order of the first non-vanishing derivative is odd and its values are non-zero, so these points are neither minima nor maxima.
x = +1 and x = −1 are points of inflection.
Multi Variable Optimization – Unconstrained NLP
We need to investigate the necessary and sufficient conditions for the minimum and maximum of an unconstrained function of several variables.
Multi Variable Optimization – Unconstrained NLP
Necessary condition: at an extreme point X*, all first partial derivatives vanish:
∂f(X*)/∂x1 = ∂f(X*)/∂x2 = ∂f(X*)/∂x3 = ... = ∂f(X*)/∂xn = 0
Multi Variable Optimization – Unconstrained NLP
1. At the extreme point, if the matrix of second partial derivatives (the Hessian matrix) is positive definite, then the point is a relative minimum (f is strictly convex there).
2. At the extreme point, if the matrix of second partial derivatives (the Hessian matrix) is negative definite, then the point is a relative maximum (f is strictly concave there).
3. At the extreme point, if the matrix of second partial derivatives (the Hessian matrix) is neither positive definite nor negative definite (indefinite), then the point is a saddle point.
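As an illustration added to these notes (not part of the original slides), a small Python sketch that applies this test numerically via the eigenvalues of the Hessian; the helper name classify_hessian is chosen here for demonstration:

import numpy as np

def classify_hessian(H, tol=1e-9):
    """Classify a stationary point from the eigenvalues of its Hessian."""
    eig = np.linalg.eigvalsh(H)          # Hessians are symmetric, so eigvalsh applies
    if np.all(eig > tol):
        return "relative minimum (H positive definite)"
    if np.all(eig < -tol):
        return "relative maximum (H negative definite)"
    if np.any(eig > tol) and np.any(eig < -tol):
        return "saddle point (H indefinite)"
    return "inconclusive (H semidefinite)"

# Hessian of the later two-variable example evaluated at (0, 0)
H = np.array([[4.0, 0.0],
              [0.0, 8.0]])
print(classify_hessian(H))               # reports: relative minimum

The printed result agrees with the tabulated classification of the point (0, 0) in Example 1 below.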
Multi Variable Optimization – Unconstrained NLP
Hessian matrix of a function of three variables:

     [ ∂²f/∂x1²     ∂²f/∂x1∂x2   ∂²f/∂x1∂x3 ]
H =  [ ∂²f/∂x2∂x1   ∂²f/∂x2²     ∂²f/∂x2∂x3 ]
     [ ∂²f/∂x3∂x1   ∂²f/∂x3∂x2   ∂²f/∂x3²   ]

For the example function on this slide (for which ∂f/∂x3 = 4x3), the Hessian evaluates to

     [  2   −2    0 ]
H =  [ −2    2    0 ]
     [  0    0    4 ]
Working Rule to Find the Extreme Points of Functions of n-Variables
Let us consider u = f(x1, x2, x3, ..., xn) as a function of the n variables x1, x2, x3, ..., xn.
Working Rule to Find the Extreme Points of Functions of n-Variables
First, find the stationary points X* from ∂u/∂x1 = ∂u/∂x2 = ... = ∂u/∂xn = 0, then evaluate the Hessian matrix H of second partial derivatives at each X*. Its leading principal minors are
H1 = ∂²u/∂x1², H2 = determinant of the upper-left 2×2 submatrix of H, ..., Hn = det H.
Hence, the following cases will arise:
• If H1 > 0, H2 > 0, ..., Hn > 0, H is positive definite and X* is a relative minimum.
• If H1 < 0, H2 > 0, H3 < 0, ... (the leading minors alternate in sign, starting negative), H is negative definite and X* is a relative maximum.
• If neither pattern holds (and no leading minor is zero), H is indefinite and X* is a saddle point; if some leading minors vanish, the test is inconclusive.
Multi Variable Optimization – Unconstrained NLP : Example 1
Find the extreme points of the function
f(x1, x2) = x1^3 + x2^3 + 2x1^2 + 4x2^2 + 6
Solution:
The necessary conditions for the existence of an extreme point are:
∂f/∂x1 = 3x1^2 + 4x1 = x1(3x1 + 4) = 0
∂f/∂x2 = 3x2^2 + 8x2 = x2(3x2 + 8) = 0
These equations are satisfied at the points: (0,0), (0, -8/3), (-4/3, 0), and (-4/3, -8/3)
Multi Variable Optimization – Unconstrained NLP : Example 1
Sufficiency conditions (Hessian matrix):
∂f/∂x1 = 3x1^2 + 4x1;  ∂f/∂x2 = 3x2^2 + 8x2
The Hessian, evaluated at a stationary point (x1*, x2*), is

     [ ∂²f/∂x1²     ∂²f/∂x1∂x2 ]   [ 6x1 + 4      0     ]
H =  [ ∂²f/∂x2∂x1   ∂²f/∂x2²   ] = [    0      6x2 + 8  ]   at (x1*, x2*)
Multi Variable Optimization – Unconstrained NLP : Example 1
Evaluating the leading minors H1 = 6x1 + 4 and H2 = det H = (6x1 + 4)(6x2 + 8) at each stationary point:

Extreme point X  | H1 | H2  | Nature of H       | Nature of extreme point | f(X)
(0, 0)           | +4 | +32 | Positive definite | Relative minimum        | 6
(0, −8/3)        | +4 | −32 | Indefinite        | Saddle point            | 418/27
(−4/3, 0)        | −4 | −32 | Indefinite        | Saddle point            | 194/27
(−4/3, −8/3)     | −4 | +32 | Negative definite | Relative maximum        | 50/3
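A short sympy sketch (an illustration added to these notes, not from the slides) that reproduces this table symbolically; the variable names are assumptions:

import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
f = x1**3 + x2**3 + 2*x1**2 + 4*x2**2 + 6

# Necessary condition: solve grad f = 0 for the stationary points
grad = [sp.diff(f, v) for v in (x1, x2)]
points = sp.solve(grad, [x1, x2], dict=True)

# Sufficiency condition: leading minors of the Hessian at each point
H = sp.hessian(f, (x1, x2))
for p in points:
    Hp = H.subs(p)
    H1 = Hp[0, 0]        # first leading principal minor
    H2 = Hp.det()        # second leading principal minor
    print(p, "H1 =", H1, "H2 =", H2, "f =", f.subs(p))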
Classical optimization theory uses differential calculus to determine points of maxima and minima (extrema) for unconstrained and constrained functions. The methods may not be suitable for efficient numerical computations, but the underlying theory provides the basis for most nonlinear programming algorithms.
Introduction – Optimization Problem
(Figure: block diagram of the design model, with constants and design variables as inputs and responses as outputs.)
Design variables:
D = diameter of the can (cm)
H = height of the can (cm)
Objective function
The problem has two independent design variables and five explicit constraints.
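Purely as an assumed illustration (not necessarily the formulation intended on the slide), a common version of the can problem minimizes the metal used subject to a required volume and size limits:

minimize    f(D, H) = π·D·H + (π/2)·D^2          (surface area of the closed can)
subject to  (π/4)·D^2·H ≥ V_required              (hypothetical volume requirement)
            D_min ≤ D ≤ D_max,  H_min ≤ H ≤ H_max (hypothetical size limits)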
Optimization Problem – Characteristics
Linearity
(Figure: a linear objective function f(x1, x2) plotted as a plane, and a nonlinear objective f(x) with several local extrema.)
f(x6) is a global or absolute maximum, and f(x1) and f(x3) are local or relative maxima.
Similarly, f(x4) is a local minimum and f(x2) is a global minimum.
Optimization Problem – Characteristics
Single‐Variable Optimization
Let f (x) be a continuous function of single variable x defined in interval [a, b].
Local Maximum
A function f(x) of a single variable is said to have a local (relative) maximum at x = x0
if f(x0) ≥ f(x0 + h) for all sufficiently small positive and negative values of h.
Local Minimum
A function f(x) of a single variable is said to have a local (relative) minimum at x = x0
if f(x0) ≤ f(x0 + h) for all sufficiently small positive and negative values of h.
Single‐Variable Optimization
A function of one variable f (x) is said to have a relative or local minimum at x = x*
if f (x*) ≤ f (x* + h) for all sufficiently small positive and negative values of h.
A function f (x) is said to have a global or absolute minimum at x* if f (x*) ≤ f (x) for all x, and not
just for all x close to x*, in the domain over which f (x) is defined.
Example 1
Assume the following relationship between the revenue and cost functions. Find the level of output x, measured in tons per week, at which profit is maximum.
-
Draw a table of variations to study the derivative around stationary and critical points.
A table of variations must contain:
• all stationary and critical points
• the value of the function at the stationary and critical points
• the intervals between and around the stationary and critical points
• the sign of the derivative in these intervals
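As an illustration added here, a table of variations for Example 1 above, f(x) = 3x^4 − 4x^3 − 24x^2 + 48x + 15, whose stationary points are x = −2, 1, 2:

x          | x < −2     | x = −2    | −2 < x < 1 | x = 1    | 1 < x < 2  | x = 2    | x > 2
sign of f' |     −      |     0     |     +      |    0     |     −      |    0     |   +
f(x)       | decreasing | −97 (min) | increasing | 38 (max) | decreasing | 31 (min) | increasing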
If the company’s expenditure on interest, rent and salaries of the staff is Rs. 1 lakh, show that the company will always be in loss.
Optimization Problem – Characteristics
Unimodality
Bracketing and sectioning methods work best for unimodal functions:
“A unimodal function consists of exactly one monotonically increasing part and one monotonically decreasing part”
Optimization Problem – Characteristics
Convexity
Convex set:
“A set S is convex if, for every two points x1 and x2 in S, the line segment connecting them also lies completely inside S”
Lagrange Multipliers
The Lagrange multiplier method can be used for any number of equality constraints.
Suppose we have a classical problem formulation with k equality
constraints
minimize f(x1, x2, ..., xn)
Subject to h1(x1, x2, ..., xn) = 0
......
hk(x1, x2, ..., xn) = 0
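In the standard form, a Lagrange function is constructed by introducing a multiplier λj for each constraint:

L(x1, ..., xn, λ1, ..., λk) = f(x1, ..., xn) + λ1·h1(x1, ..., xn) + ... + λk·hk(x1, ..., xn)

The necessary conditions for a constrained extremum are then

∂L/∂xi = 0 for i = 1, ..., n   and   ∂L/∂λj = hj = 0 for j = 1, ..., k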
Cost of manufacturing:
C(x, y) = 6x^2 + 12y^2
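The constraint that accompanies this cost function is not included in the extract. Purely as an illustration of the Lagrange multiplier method, the sketch below assumes a hypothetical constraint x + y = 90 and solves the Lagrange conditions with sympy:

import sympy as sp

x, y, lam = sp.symbols('x y lam', real=True)
C = 6*x**2 + 12*y**2            # cost function from the slide
h = x + y - 90                  # hypothetical equality constraint (assumed for illustration)

# Lagrangian and its stationarity conditions
L = C + lam*h
eqs = [sp.diff(L, v) for v in (x, y, lam)]
sol = sp.solve(eqs, [x, y, lam], dict=True)
print(sol)                      # candidate constrained extremum: x = 60, y = 30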
Single-variable methods
🞑 0th order (involving only f )
🞑 1st order (involving f and f ’ )
🞑 2nd order (involving f, f ’ and f ” )
Multiple variable methods
🞑 Gradient Descent Methods
🞑 Simplex Method
🞑 Sequential Linear Programming
🞑 Sequential Quadratic Programming
🞑 Etc.
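As one concrete illustration of the gradient-descent family listed above (added to these notes, not from the slides), a minimal Python sketch applied to the earlier example f(x1, x2) = x1^3 + x2^3 + 2x1^2 + 4x2^2 + 6, which has a relative minimum at (0, 0); the step size and starting point are assumptions:

import numpy as np

def grad_f(x):
    """Gradient of f(x1, x2) = x1^3 + x2^3 + 2*x1^2 + 4*x2^2 + 6."""
    x1, x2 = x
    return np.array([3*x1**2 + 4*x1, 3*x2**2 + 8*x2])

x = np.array([0.5, 0.5])        # assumed starting point near the minimum
step = 0.05                     # assumed fixed step size
for _ in range(200):
    x = x - step * grad_f(x)    # steepest-descent update

print(x)                        # converges toward the relative minimum at (0, 0)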