
Optimization Methods - Detailed Notes

Lecture 7: Optimization of Single Variable Functions

Optimization aims to find the maximum or minimum values of a function. For single-variable functions, this involves analyzing critical points, the first derivative, and the second derivative.

Key Definitions:

1. **Local Minimum**: A point where the function value is smaller than its immediate surroundings.

- Example: If f(x) = x^2, x = 0 is a local minimum because f(0) = 0 is smaller than nearby values such as f(-1) = f(1) = 1.

2. **Global Minimum**: A point where the function value is the smallest across the entire domain.

- Example: For f(x) = x^2, x = 0 is also the global minimum since f(0) <= f(x) for all x.

Critical Points:

A critical point is a point where the derivative of the function is zero or undefined. These points are the candidates for minima or maxima.

- Example: For f(x) = 3x^2 - 6x, f'(x) = 6x - 6; solving f'(x) = 0 gives x = 1, the only critical point.
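As a quick numerical check of the example above (a minimal Python sketch; the `derivative` helper is an illustration, not part of the notes), a central-difference approximation of f' should be essentially zero at the critical point:

```python
def derivative(f, x, h=1e-6):
    # Central-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: 3 * x**2 - 6 * x

# f'(x) = 6x - 6, so the approximation at x = 1 should be (numerically) zero
print(derivative(f, 1.0))
```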

Theorems and Tests

1. **First Derivative Test**:

- If the derivative changes from negative to positive at a critical point, it's a local minimum.

- If it changes from positive to negative, it's a local maximum.

- If the sign does not change, the point is neither a local minimum nor a local maximum (e.g., x = 0 for f(x) = x^3).
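The first derivative test can be sketched numerically (a hedged illustration; `first_derivative_test` is a hypothetical helper that approximates f' by central differences just before and just after the critical point):

```python
def derivative(f, x, h=1e-6):
    # Central-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

def first_derivative_test(f, c, eps=1e-3):
    # Compare the sign of f' on either side of the critical point c
    before, after = derivative(f, c - eps), derivative(f, c + eps)
    if before < 0 < after:
        return "local minimum"
    if before > 0 > after:
        return "local maximum"
    return "neither"

print(first_derivative_test(lambda x: x**2, 0.0))   # local minimum
print(first_derivative_test(lambda x: x**3, 0.0))   # neither
```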

2. **Second Derivative Test**:


- If f''(c) > 0, the function has a local minimum at c.

- If f''(c) < 0, the function has a local maximum at c.

- If f''(c) = 0, the test is inconclusive.

**Example**:

Find and classify the critical points of f(x) = x^3 - 3x^2 + 4:

- f'(x) = 3x^2 - 6x

- Critical points: Solve 3x^2 - 6x = 0 to get x = 0 and x = 2.

- Use the second derivative: f''(x) = 6x - 6. At x = 0, f''(0) = -6 < 0 (local max). At x = 2, f''(2) = 6 > 0 (local min).
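The worked example above can be reproduced with a short numerical sketch (the helpers `second_derivative` and `classify` are hypothetical, using a finite-difference approximation of f''):

```python
def second_derivative(f, x, h=1e-4):
    # Central-difference approximation of f''(x)
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

def classify(f, c, tol=1e-6):
    # Second derivative test at a critical point c
    d2 = second_derivative(f, c)
    if d2 > tol:
        return "local minimum"
    if d2 < -tol:
        return "local maximum"
    return "inconclusive"

f = lambda x: x**3 - 3 * x**2 + 4

print(classify(f, 0.0))  # local maximum (f''(0) = -6)
print(classify(f, 2.0))  # local minimum (f''(2) = 6)
```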

Lecture 8: Optimization of Multivariable Functions

Multivariable optimization deals with functions of two or more variables, such as f(x, y). Critical points are found using partial derivatives.

Key Steps:

1. Compute the partial derivatives (e.g., f_x and f_y).

2. Set them to zero and solve for the critical points.

3. Use the second derivative test to classify the points.

Second Derivative Test:

For a function f(x, y), calculate:

- f_xx, f_yy, f_xy (second partial derivatives).

- D = f_xx * f_yy - (f_xy)^2.


Classification:

- If D > 0 and f_xx > 0, it's a local minimum.

- If D > 0 and f_xx < 0, it's a local maximum.

- If D < 0, it's a saddle point.

- If D = 0, the test is inconclusive.

**Example**:

Classify critical points of f(x, y) = x^2 + y^2 - 4x - 2y:

- Partial derivatives: f_x = 2x - 4, f_y = 2y - 2.

- Critical point: Solve f_x = 0 and f_y = 0 to get (x, y) = (2, 1).

- Second derivatives: f_xx = 2, f_yy = 2, f_xy = 0.

- D = (2)(2) - (0)^2 = 4 > 0 and f_xx = 2 > 0 (local minimum at (2, 1)).
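The classification rules above can be wrapped in a small sketch (the `classify_2d` helper is hypothetical and assumes the second partials at the critical point are already computed):

```python
def classify_2d(f_xx, f_yy, f_xy):
    # Second derivative (discriminant) test for f(x, y)
    D = f_xx * f_yy - f_xy**2
    if D > 0:
        return "local minimum" if f_xx > 0 else "local maximum"
    if D < 0:
        return "saddle point"
    return "inconclusive"

# f(x, y) = x^2 + y^2 - 4x - 2y has constant second partials
print(classify_2d(2.0, 2.0, 0.0))  # local minimum, as in the example
```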

Hessian Matrix for Multivariable Optimization

The Hessian matrix is used to classify critical points and generalizes to functions of more than two variables. For f(x, y) it is defined as:

H = [ [f_xx, f_xy],
      [f_yx, f_yy] ]

The eigenvalues of the Hessian determine the point's nature: all positive (positive definite) means a local minimum, all negative means a local maximum, and mixed signs mean a saddle point.
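For the two-variable case, the eigenvalues of a symmetric 2x2 Hessian can be computed in closed form (a minimal sketch; `eig_2x2_symmetric` is a hypothetical helper derived from the characteristic polynomial):

```python
import math

def eig_2x2_symmetric(a, b, d):
    # Eigenvalues of [[a, b], [b, d]] from the characteristic polynomial
    # lambda^2 - (a + d) * lambda + (a * d - b^2) = 0
    tr, det = a + d, a * d - b * b
    disc = math.sqrt(tr * tr - 4 * det)  # non-negative for symmetric matrices
    return (tr - disc) / 2, (tr + disc) / 2

# Hessian of f(x, y) = x^2 + y^2 - 4x - 2y (constant in x and y)
lo, hi = eig_2x2_symmetric(2.0, 0.0, 2.0)
print(lo, hi)  # both positive -> positive definite -> local minimum
```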

