
ISyE 6669 HW 4

Spring 2021

1 Convex optimization
Convex optimization is the backbone of modern optimization. We learned some simple algorithmic schemes, such as gradient descent and Newton's method, among others. These two algorithms are especially well suited to minimizing convex functions that are continuously differentiable or twice differentiable, respectively.

1. Use Newton’s method to minimize the convex function f(x) = e^x − x on the entire real line x ∈ R. You need to first write down the first-order and second-order derivatives of f(x). Then, write down a step of Newton’s method, e.g. x_{k+1} = x_k − f'(x_k)/f''(x_k), plugging in the expressions of the derivatives. Carry out Newton’s method from the initial point x_0 = −1. Write down x_k, f(x_k), f'(x_k) in each iteration, until you reach a solution with first-order derivative |f'(x_k)| < 10^-5. You should need fewer than 10 steps.
Solution:
We have the convex function f(x) = e^x − x.

From a plot of f(x) we see that this function attains its minimum at the point x = 0, and that its minimum value is f(0) = 1.

The first- and second-order derivatives of the function are given by:

f'(x) = e^x − 1
f''(x) = e^x

Now let us use Newton's method to find the minimum of the function:


Iteration 0:

x_0 = −1
f(x_0) = 1.3678794411714423
f'(x_0) = −0.6321205588285577
f''(x_0) = 0.36787944117144233
|f'(x_0)| = 0.6321205588285577 ≥ 10^-5 ⇒ continue

Iteration 1:

x_1 = x_0 − f'(x_0)/f''(x_0) = 0.7182818284590451
f(x_1) = 1.332624544233456
f'(x_1) = 1.0509063726925012
f''(x_1) = 2.050906372692501
|f'(x_1)| = 1.0509063726925012 ≥ 10^-5 ⇒ continue

Iteration 2:

x_2 = x_1 − f'(x_1)/f''(x_1) = 0.20587112717830613
f(x_2) = 1.0227237341276951
f'(x_2) = 0.22859486130600137
f''(x_2) = 1.2285948613060014
|f'(x_2)| = 0.22859486130600137 ≥ 10^-5 ⇒ continue

Iteration 3:

x_3 = x_2 − f'(x_2)/f''(x_2) = 0.019809091184598587
f(x_3) = 1.0001975020028975
f'(x_3) = 0.02000659318749598
f''(x_3) = 1.020006593187496
|f'(x_3)| = 0.02000659318749598 ≥ 10^-5 ⇒ continue

Iteration 4:

x_4 = x_3 − f'(x_3)/f''(x_3) = 0.00019491092231630272
f(x_4) = 1.000000018996368
f'(x_4) = 0.00019492991868430565
f''(x_4) = 1.0001949299186843
|f'(x_4)| = 0.00019492991868430565 ≥ 10^-5 ⇒ continue

Iteration 5:

x_5 = x_4 − f'(x_4)/f''(x_4) = 0.0000000189938997
f(x_5) = 1.0000000000000002
f'(x_5) = 0.0000000189939000
f''(x_5) = 1.0000000189939
|f'(x_5)| = 0.0000000189939000 < 10^-5 ⇒ stop

Using Newton’s method we get the solution x = 0.0000000189938997 with objective value f(x) = 1.0000000000000002, which is quite close to the true minimizer x = 0 and minimum value f(0) = 1.
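The iteration above can be reproduced with a short script. This is a minimal sketch that hard-codes the closed-form derivatives f'(x) = e^x − 1 and f''(x) = e^x and uses the same stopping tolerance 10^-5 as the problem statement:

```python
import math

def newton_minimize(x0, tol=1e-5, max_iter=10):
    """Newton's method for f(x) = e^x - x, using f'(x) = e^x - 1 and f''(x) = e^x."""
    x = x0
    for k in range(max_iter):
        grad = math.exp(x) - 1.0      # f'(x_k)
        if abs(grad) < tol:           # stopping criterion |f'(x_k)| < tol
            return x, k
        hess = math.exp(x)            # f''(x_k) > 0, so the step is well defined
        x = x - grad / hess           # Newton step x_{k+1} = x_k - f'(x_k)/f''(x_k)
    return x, max_iter

x_star, iters = newton_minimize(-1.0)
print(x_star, iters)  # stops at iteration 5 with x_5 = 1.899...e-08
```

Starting from x_0 = −1 this produces exactly the sequence of iterates tabulated above.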
2. Minimizing a convex function f(x) without any constraint is equivalent to solving the first-order optimality condition f'(x) = 0, i.e. finding the point x* where the curve y = f'(x) crosses the horizontal line y = 0. Such an x* is also called a zero of the equation f'(x) = 0. The curve y = f'(x) is usually a nonlinear curve. Think of Question 1. What Newton's method actually does is solve this nonlinear equation by solving a sequence of linear equations that approximate it. How do we approximate a nonlinear curve by a linear one? Right: use the tangent line at a point of the nonlinear curve, or equivalently, use the Taylor expansion we all know from calculus.

So suppose we want to solve f'(x) = 0 and suppose the second-order derivative f''(x) always exists. First, write down the tangent line of the curve y = f'(x) at a point x_k. Hint: the line passing through the point (x = a, y = b) with slope k has the equation y = k(x − a) + b.
Suppose your equation for the tangent line is y = k(x − a) + b. Of course, you need to fill in k, a, b and express them in terms of x_k, f'(x_k), f''(x_k). Then, solve the equation k(x − a) + b = 0 and write down the expression for the solution in terms of k, a, b. The solution is the next iterate x_{k+1}. If you have done everything correctly, you should recover Newton's iterate. The next step of Newton's method starts from x_{k+1}, forms the tangent line of the curve y = f'(x) at x_{k+1}, finds the zero of this linear equation, and continues.
Plot y = f'(x) for f(x) = e^x − x. Start from x_0 = −1. Draw the tangent line at x_0, find its zero, and continue until you reach x_2. You should see the same sequence as in the first question; this should give you a geometric understanding of Newton's method.
Solution: In this part we search for the minimum by solving f'(x) = 0. To do that, we draw the tangent to f'(x) = e^x − 1 at a point x_i. Using the formula y = k(x − a) + b and substituting k = f''(x_i), a = x_i, b = f'(x_i), we get that the equation of the tangent line at the point x_i is

y = f''(x_i) x + f'(x_i) − f''(x_i) x_i.

To find the next point, we find the intersection of the tangent line with the x-axis:

f''(x_i) x + f'(x_i) − f''(x_i) x_i = 0 ⇒ x = x_i − f'(x_i)/f''(x_i)

This is the same formula we used in the previous part.
As before, we start at the point x_0 = −1. The equation of the tangent to f'(x) = e^x − 1 at the point x_0 = −1 is given by

y = 0.36787944117144233 × x − 0.26424111765711533

We find the point x_1 as the intersection of the tangent line with the x-axis:

0.36787944117144233 × x − 0.26424111765711533 = 0 ⇒ x_1 = 0.7182818284590451.

At the point x_1 we get the tangent line

y = 2.050906372692501 × x − 0.4222224066833764.

We get the next point by solving

2.050906372692501 × x − 0.4222224066833764 = 0 ⇒ x_2 = 0.20587112717830613.

At the point x_2, the equation of the tangent line is given by

y = 1.2285948613060014 × x − 0.02433734763653983.

We could continue this process until we reach the stopping criterion as before, but we stop here. As expected, the points x_0, x_1, x_2 obtained this way are the same points we obtained in Q1.1.
Below is a plot of the function f'(x) and the first three tangent lines.
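The tangent-line construction above is easy to check numerically. This minimal sketch (plotting code omitted) computes each tangent line y = f''(x_i)·x + (f'(x_i) − f''(x_i)·x_i) and its x-intercept, which is the next iterate:

```python
import math

def fprime(x):   # f'(x) = e^x - 1 for f(x) = e^x - x
    return math.exp(x) - 1.0

def fsecond(x):  # f''(x) = e^x
    return math.exp(x)

def tangent_and_zero(xi):
    """Tangent line to y = f'(x) at xi as (slope, intercept), plus its zero,
    which is the next Newton iterate."""
    slope = fsecond(xi)
    intercept = fprime(xi) - fsecond(xi) * xi
    x_next = xi - fprime(xi) / fsecond(xi)  # solves slope*x + intercept = 0
    return slope, intercept, x_next

x0 = -1.0
slope0, b0, x1 = tangent_and_zero(x0)  # y = 0.36787...x - 0.26424..., zero at x1
slope1, b1, x2 = tangent_and_zero(x1)  # y = 2.05090...x - 0.42222..., zero at x2
print(x1, x2)  # 0.7182818284590451 0.20587112717830613
```

The printed zeros match x_1 and x_2 from Q1.1, confirming the geometric picture.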

2 Nonconvex optimization
The following problems involve optimization of nonlinear, nonconvex functions with some simple constraints, such as an interval or a box in higher dimensions.
To minimize each of the following functions, you can use the command minimize
from scipy.optimize. Due to the nonconvex and sometimes nonsmooth (i.e.
not differentiable at some points) nature of the objective function, you need to
be careful about the starting point and the constraints you set. For example,
you may need to set the box small enough to help the solver find a good local
optimum. You need to provide a printout of your code, along with the solution.

1. f(x1, x2) = (1 − x1 + x1·x2)^2 + (2 − x1 + x1^2·x2)^2 + (3 − x1 + x1^3·x2)^2 over the box −5 ≤ x1, x2 ≤ 5. Start from (0, 0). Plot the function f(x1, x2) over the box [−5, 5]^2 using both a 2D contour plot and a 3D plot.
Solution:
Optimal value: 1.1684123071910781
Optimal solution: x1 = 1.51907761, x2 = −0.27672452
Below are 2D and 3D plots of the function. From the 2D contour plot we see that the solution is where we would expect it to be.
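A minimal sketch of the computation (plotting code omitted). It assumes scipy is available and uses L-BFGS-B, which is scipy's default method when bounds are supplied, though the exact solver settings used above are not stated:

```python
from scipy.optimize import minimize

def f(z):
    x1, x2 = z
    return ((1 - x1 + x1 * x2) ** 2
            + (2 - x1 + x1 ** 2 * x2) ** 2
            + (3 - x1 + x1 ** 3 * x2) ** 2)

# Box constraint -5 <= x1, x2 <= 5, starting point (0, 0).
res = minimize(f, x0=[0.0, 0.0], method="L-BFGS-B",
               bounds=[(-5, 5), (-5, 5)])
print(res.fun, res.x)
```

Evaluating f at the reported solution (1.51907761, −0.27672452) gives the reported optimal value 1.1684 to the displayed precision.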

2. Consider the following function

f(x1, x2) = −0.0001 × ( | sin(x1) sin(x2) exp( | 100 − √(x1^2 + x2^2)/π | ) | + 1 )^0.1.

This function has many local minima and several global minima. Plot f(x1, x2) over the box [−10, 10]^2 using both a 2D contour plot and a 3D plot. Try to find at least two different global minima and one local minimum that is not a global minimum. Hint: you can start the algorithm that you choose from different starting points.
Solution:
Below you can see the 2D and 3D plots of the function. We see that the function has multiple local and global minima.

Global minima:
Global minimum obtained using starting point (−1, −1): −2.0626118707233263
Optimal solution: x1 = −1.3493852, x2 = −1.3493852
Global minimum obtained using starting point (1, 1): −2.062611870736794
Optimal solution: x1 = 1.3493867, x2 = 1.3493867
Global minimum obtained using starting point (−1, 1): −2.0626118707360184
Optimal solution: x1 = −1.3493867, x2 = 1.34938652
Global minimum obtained using starting point (1, −1): −2.0626118707360184
Optimal solution: x1 = 1.34938652, x2 = −1.3493867

Local minima that are not global minima:

Local minimum obtained using starting point (−5, −1): −1.8899380457820998
Solution: x1 = −4.41908156, x2 = −1.47062333
Local minimum obtained using starting point (−1, −1): −1.554463337514644
Solution: x1 = 7.63258721, x2 = −7.63253296
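The multi-start search described above can be sketched as follows. This assumes scipy is available and uses L-BFGS-B with the box bounds; the function definition follows the formula given in the problem:

```python
import numpy as np
from scipy.optimize import minimize

def f(z):
    x1, x2 = z
    inner = abs(100.0 - np.sqrt(x1 ** 2 + x2 ** 2) / np.pi)
    return -0.0001 * (abs(np.sin(x1) * np.sin(x2) * np.exp(inner)) + 1.0) ** 0.1

# Multi-start: run a local solver from several starting points in the box.
starts = [(-1, -1), (1, 1), (-1, 1), (1, -1), (-5, -1)]
results = [minimize(f, x0=s, method="L-BFGS-B",
                    bounds=[(-10, 10), (-10, 10)]) for s in starts]
for s, r in zip(starts, results):
    print(s, r.fun, r.x)
```

Evaluating f at the reported global minimizer (1.3493867, 1.3493867) reproduces the reported global minimum value of about −2.0626, and different starting points land in different local minima, as the solution above shows.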
