
Homework # 02

Prof. Elena Gryazina, Mazhar Ali (TA), Oleg Khamisov (TA)


(Total: 35 points)
Submission Deadline: 17th February (Monday), 16:00

Question 01
(2 points) Find the minimum (i.e., both the point $x^*$ and the function value $f^* = f(x^*)$) of the following function, in terms of the parameters $a, b, c$:

$$f(x) = ax^2 + bx + c$$
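
A quick numerical sanity check of your closed-form answer is possible with a scalar minimizer. The snippet below is a minimal sketch assuming $a > 0$ (otherwise no minimum exists); the sample values of $a, b, c$ are arbitrary illustrations, not part of the problem.

```python
# Sanity check for Question 01 (a sketch, not part of the required solution).
# Assumes a > 0 so the parabola attains its minimum; a, b, c are arbitrary.
from scipy.optimize import minimize_scalar

a, b, c = 2.0, -3.0, 1.0

f = lambda x: a * x**2 + b * x + c
res = minimize_scalar(f)            # derivative-free Brent minimization

print(f"x* = {res.x:.6f}, f* = {res.fun:.6f}")
# Compare against your symbolic answer in terms of a, b, c.
```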

Question 02
(1 point) Let $h(x) = f(Ax)$ be constructed from a function $f: \mathbb{R}^m \to \mathbb{R}$ and a matrix $A \in \mathbb{R}^{m \times k}$. What is the dimension of the gradient $\nabla h(x)$?

Question 03
(3 points) Prove that for a strongly convex function $f$ with parameter $\mu$ and minimizer $x^*$ the following holds:

$$\frac{\mu}{2}\,\|x - x^*\|_2^2 \le f(x) - f(x^*)$$

Question 04
(3 points) Find the conjugate $f^*(y)$ and its domain for the function $f(x) = 1/x$, defined only on positive arguments: $\operatorname{dom} f = \{x : x > 0\}$.
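
You can spot-check your answer by approximating the supremum in the definition $f^*(y) = \sup_{x>0}\,(xy - f(x))$ on a finite grid. A minimal sketch follows; the grid bounds and the sample $y$ values are arbitrary choices, and a finite grid only gives a lower estimate of the true supremum.

```python
# Grid approximation of the conjugate for Question 04 (a sketch).
import numpy as np

x = np.linspace(1e-4, 100.0, 200_000)    # grid over dom f = (0, inf)

def conj_estimate(y):
    return np.max(x * y - 1.0 / x)       # approximate sup_{x>0} (xy - 1/x)

for y in [-4.0, -1.0, -0.25]:            # try points in your claimed dom f*
    print(y, conj_estimate(y))           # compare with your closed-form f*(y)
```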

Question 05
(4 points) Given the conjugate $f^*(y)$ of a function $f(x)$, find a) the conjugate function and b) its domain for $g(x) = f(x) + (c, x) + d$, where $c \in \mathbb{R}^n$ and $d \in \mathbb{R}$.

Question 06
(3 points) Derive a) the gradient $\nabla f(x)$ and b) the Hessian matrix $\nabla^2 f(x)$ (both in vector form) for the function $f(x) = (x, c)^2$, where $x, c \in \mathbb{R}^n$.
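
After deriving the formulas, a finite-difference comparison catches algebra slips. The sketch below checks a gradient numerically (random test vectors, arbitrary step size $h$); the same pattern extends to the Hessian by differencing the gradient.

```python
# Finite-difference gradient check for Question 06 (a sketch).
import numpy as np

rng = np.random.default_rng(0)
n = 5
c = rng.standard_normal(n)
x = rng.standard_normal(n)

f = lambda x: np.dot(x, c) ** 2

h = 1e-6                                  # arbitrary small step
grad_fd = np.array([(f(x + h * e) - f(x - h * e)) / (2 * h)
                    for e in np.eye(n)])  # central differences

print(grad_fd)   # compare with your derived formula evaluated at this x, c
```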

Question 07
(3 points) Derive the Hessian matrix $\nabla_x^2 f(x)$ for the function $f(x) = g(Ax + b)$, assuming twice differentiable $g: \mathbb{R}^m \to \mathbb{R}$, with dimensions $A \in \mathbb{R}^{m \times n}$, $b \in \mathbb{R}^m$, $x \in \mathbb{R}^n$.

Question 08
(2 points) Prove the sufficient first-order optimality condition for an (everywhere) differentiable convex function $f(x)$: if $\nabla f(x^*) = 0$, then $x^*$ is a global minimum of $f$.

Question 09
(3 points) Solve the optimal step-size problem for the quadratic function below, with a symmetric positive definite matrix $A \succ 0$, $A \in \mathbb{R}^{n \times n}$, and $x, b, d \in \mathbb{R}^n$. Your goal is to find the optimal $\gamma^*$ for given $A, b, d, x$. The resulting expression must be written in terms of inner products $(\cdot\,, \cdot)$:

$$f(\gamma) = (A(x + \gamma d),\, x + \gamma d) + (b,\, x + \gamma d) \to \min_{\gamma \in \mathbb{R}}$$
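
To verify your closed-form $\gamma^*$, you can minimize $f(\gamma)$ numerically on random test data. A minimal sketch, where $A$ is made symmetric positive definite by construction and all other data are arbitrary:

```python
# Numerical check of the optimal step size for Question 09 (a sketch).
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
n = 4
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)               # symmetric positive definite
b, d, x = (rng.standard_normal(n) for _ in range(3))

f = lambda g: (A @ (x + g * d)) @ (x + g * d) + b @ (x + g * d)

res = minimize_scalar(f)                   # f is quadratic in gamma
print("numerical gamma* =", res.x)         # compare with your formula
```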

Question 10
(3 points) Derive the subgradient (subdifferential) of the function $f(x) = [x^2 - 1]_+$, $x \in \mathbb{R}$ (do not write the subgradient method).
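
A candidate subgradient $g$ at a point $x_0$ must satisfy $f(y) \ge f(x_0) + g\,(y - x_0)$ for all $y$. The sketch below tests this inequality on a grid, which is a quick way to probe your answer at the kink points $x = \pm 1$; the grid and the candidate values are arbitrary illustrations.

```python
# Subgradient-inequality test for Question 10 (a sketch).
import numpy as np

f = lambda x: np.maximum(x**2 - 1.0, 0.0)

def is_subgradient(g, x0, ys, tol=1e-12):
    """True if f(y) >= f(x0) + g*(y - x0) on every test point in ys."""
    return bool(np.all(f(ys) >= f(x0) + g * (ys - x0) - tol))

ys = np.linspace(-3.0, 3.0, 10_001)
print(is_subgradient(1.0, 1.0, ys))   # a candidate at the kink x0 = 1
print(is_subgradient(3.0, 1.0, ys))   # a candidate that should fail
```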

Question 11
(3 points) Let $f: \mathbb{R}^n \to \mathbb{R}$ be given by $f(x) = \frac{1}{2} x^\top Q x - x^\top b$, where $b \in \mathbb{R}^n$ and $Q$ is a real symmetric positive definite $n \times n$ matrix. Suppose that we apply the steepest descent (gradient descent with exact line search) method to this function with $x^0 \ne Q^{-1} b$. Show that the method converges in one step, that is $x^1 = Q^{-1} b$, if and only if $x^0$ is chosen such that $g^0 = Q x^0 - b$ is an eigenvector of $Q$.
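
Before writing the proof, it may help to see the claim numerically: construct $x^0$ so that $g^0$ is an eigenvector of $Q$, take one exact line-search step, and check that $x^1 = Q^{-1} b$. A minimal sketch with arbitrary random data:

```python
# One-step convergence illustration for Question 11 (a sketch).
import numpy as np

rng = np.random.default_rng(2)
n = 4
M = rng.standard_normal((n, n))
Q = M @ M.T + n * np.eye(n)            # symmetric positive definite
b = rng.standard_normal(n)

v = np.linalg.eigh(Q)[1][:, 0]         # an eigenvector of Q
x0 = np.linalg.solve(Q, b + v)         # chosen so that g0 = Q x0 - b = v

g0 = Q @ x0 - b
gamma = (g0 @ g0) / (g0 @ (Q @ g0))    # exact line-search step size
x1 = x0 - gamma * g0

print(np.allclose(x1, np.linalg.solve(Q, b)))   # expect True
```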

Question 12
(5 points) Find the minimizer of

$$f(x, y) = x^2 + xy + 10y^2 - 22y - 5x$$

numerically by steepest descent.
1. For each iteration, record the values of $x$, $y$, and $f$, and include them in a table.
2. Plot the iterates on a contour plot.
3. Explore different starting values, such as $(1, 10)$, $(10, 10)$, $(10, 1)$. Does the number of steps depend significantly on the starting guess? (A starter sketch for the iteration loop is given below.)
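
The loop below is a minimal starting-point sketch, not a complete solution: it writes $f$ as $\tfrac{1}{2} z^\top Q z - c^\top z$ with $z = (x, y)$, uses the exact line-search step for quadratics (cf. Question 09), and leaves the table and the contour plot to you. The starting point and stopping tolerance are arbitrary choices.

```python
# Steepest descent sketch for Question 12.
import numpy as np

Q = np.array([[2.0, 1.0],
              [1.0, 20.0]])            # Hessian of f
c = np.array([5.0, 22.0])              # grad f(z) = Q @ z - c

f = lambda z: 0.5 * z @ Q @ z - c @ z

z = np.array([1.0, 10.0])              # starting guess (x0, y0)
for k in range(100):
    g = Q @ z - c                      # gradient at the current iterate
    if np.linalg.norm(g) < 1e-8:       # arbitrary stopping tolerance
        break
    gamma = (g @ g) / (g @ (Q @ g))    # exact line search for a quadratic
    z = z - gamma * g
    print(k, z[0], z[1], f(z))         # one row of your table per step

print("minimizer =", z)
```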
