Optimization Lecture Notes
Optimization aims to find the maximum or minimum values of a function. For single-variable functions,
this involves analyzing critical points, the first derivative, and the second derivative.
Key Definitions:
1. **Local Minimum**: A point where the function value is smaller than its immediate surroundings.
- Example: If f(x) = x^2, then x = 0 is a local minimum because f(0) < f(-1) and f(0) < f(1).
2. **Global Minimum**: A point where the function value is the smallest across the entire domain.
- Example: For f(x) = x^2, x = 0 is also the global minimum since f(0) <= f(x) for all x (see the quick numerical check below).
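As a quick numerical illustration (a minimal sketch added here, assuming Python with NumPy), the snippet below samples f(x) = x^2 on a grid and confirms that the smallest sampled value occurs at x = 0:

```python
import numpy as np

# Sample f(x) = x^2 on a grid and confirm the smallest value occurs at x = 0.
x = np.linspace(-5, 5, 1001)     # grid includes x = 0 exactly (step 0.01)
f = x ** 2

i_min = np.argmin(f)             # index of the smallest sampled value
print(x[i_min])                  # -> 0.0, the minimizer on this grid
print(np.all(f >= f[i_min]))     # -> True: f(0) <= f(x) for every sampled x
```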
Critical Points:
A critical point is where the derivative of the function is zero or undefined. These are candidates for minima or
maxima.
- Example: For f(x) = 3x^2 - 6x, solve f'(x) = 0 to get x = 1. This is a critical point.
- If the derivative changes sign from negative to positive at a critical point, that point is a local minimum (see the sketch below).
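A minimal sketch of this sign-change test for the example above, assuming Python with SymPy (the variable names are illustrative):

```python
import sympy as sp

x = sp.symbols('x')
f = 3 * x**2 - 6 * x                 # f(x) = 3x^2 - 6x from the example
fp = sp.diff(f, x)                   # f'(x) = 6x - 6

for c in sp.solve(sp.Eq(fp, 0), x):  # critical points; here just x = 1
    left = fp.subs(x, c - sp.Rational(1, 2))   # sign of f' just left of c
    right = fp.subs(x, c + sp.Rational(1, 2))  # sign of f' just right of c
    if left < 0 and right > 0:
        print(f"x = {c}: local minimum (f' changes - to +)")
    elif left > 0 and right < 0:
        print(f"x = {c}: local maximum (f' changes + to -)")
    else:
        print(f"x = {c}: no sign change, test inconclusive")
```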
**Example**:
- Let f(x) = x^3 - 3x^2 (an antiderivative consistent with the derivative below), so f'(x) = 3x^2 - 6x. Setting f'(x) = 0 gives critical points x = 0 and x = 2.
- Use the second derivative: f''(x) = 6x - 6. At x = 0, f''(0) = -6 < 0 (local max). At x = 2, f''(2) = 6 > 0 (local min).
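A minimal sketch of this second derivative test in Python (assuming SymPy, and taking f(x) = x^3 - 3x^2 as in the example above):

```python
import sympy as sp

x = sp.symbols('x')
f = x**3 - 3 * x**2                  # antiderivative consistent with f'(x) = 3x^2 - 6x
fp = sp.diff(f, x)                   # 3x^2 - 6x
fpp = sp.diff(f, x, 2)               # 6x - 6

for c in sp.solve(sp.Eq(fp, 0), x):  # critical points: x = 0 and x = 2
    curvature = fpp.subs(x, c)
    if curvature > 0:
        print(f"x = {c}: local minimum (f'' = {curvature} > 0)")
    elif curvature < 0:
        print(f"x = {c}: local maximum (f'' = {curvature} < 0)")
    else:
        print(f"x = {c}: f'' = 0, test inconclusive")
```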
Multivariable optimization deals with functions of two or more variables, such as f(x, y). Critical points are points where all first-order partial derivatives are zero (or undefined) simultaneously.
Key Steps:
1. Compute the partial derivatives f_x and f_y.
2. Solve the system f_x = 0 and f_y = 0 to find the critical points.
3. Compute the second partials f_xx, f_yy, f_xy and the discriminant D = f_xx * f_yy - (f_xy)^2.
4. Classify each critical point from the signs of D and f_xx (see Classification below).
Classification:
- D > 0 and f_xx > 0: local minimum.
- D > 0 and f_xx < 0: local maximum.
- D < 0: saddle point.
- D = 0: the test is inconclusive.
**Example**:
- Critical point: solve f_x = 0 and f_y = 0 to get (x, y) = (2, 1).
- Discriminant: D = f_xx * f_yy - (f_xy)^2 = (2)(2) - (0)^2 = 4 > 0, and f_xx = 2 > 0, so (2, 1) is a local minimum (see the sketch below).
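The notes do not state the underlying function, so the sketch below assumes f(x, y) = x^2 + y^2 - 4x - 2y, one function that reproduces the quoted values (f_x = 2x - 4, f_y = 2y - 2, f_xx = f_yy = 2, f_xy = 0, critical point (2, 1)), and applies the D test with SymPy:

```python
import sympy as sp

x, y = sp.symbols('x y')
# Assumed example function, chosen to match the values quoted in the notes.
f = x**2 + y**2 - 4*x - 2*y

fx, fy = sp.diff(f, x), sp.diff(f, y)
fxx, fyy, fxy = sp.diff(f, x, 2), sp.diff(f, y, 2), sp.diff(f, x, y)

# Solve f_x = 0, f_y = 0; dict=True returns a list of solution dictionaries.
for pt in sp.solve([sp.Eq(fx, 0), sp.Eq(fy, 0)], [x, y], dict=True):
    D = (fxx * fyy - fxy**2).subs(pt)
    a = fxx.subs(pt)
    if D > 0 and a > 0:
        print(f"{pt}: local minimum (D = {D}, f_xx = {a})")
    elif D > 0 and a < 0:
        print(f"{pt}: local maximum (D = {D}, f_xx = {a})")
    elif D < 0:
        print(f"{pt}: saddle point (D = {D})")
    else:
        print(f"{pt}: test inconclusive (D = 0)")
```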
The Hessian matrix is used to classify critical points for functions of two or more variables. For f(x, y) it is defined as:
H = [ [f_xx, f_xy],
      [f_yx, f_yy] ]
The eigenvalues of the Hessian at a critical point determine its nature: all eigenvalues positive (positive definite) gives a local minimum, all negative (negative definite) gives a local maximum, and mixed signs give a saddle point. A sketch of this check follows.
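A minimal sketch of the eigenvalue check, assuming NumPy and reusing the Hessian of the assumed two-variable example above (constant here because that function is quadratic):

```python
import numpy as np

# Hessian of the assumed example f(x, y) = x^2 + y^2 - 4x - 2y at (2, 1).
H = np.array([[2.0, 0.0],
              [0.0, 2.0]])

eigenvalues = np.linalg.eigvalsh(H)   # Hessian is symmetric, so eigenvalues are real
if np.all(eigenvalues > 0):
    print("positive definite: local minimum")
elif np.all(eigenvalues < 0):
    print("negative definite: local maximum")
elif np.any(eigenvalues > 0) and np.any(eigenvalues < 0):
    print("indefinite: saddle point")
else:
    print("semi-definite: test inconclusive")
```

np.linalg.eigvalsh is used rather than eigvals because the Hessian is symmetric; this guarantees real eigenvalues and avoids complex output.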