
Advanced Graph Algorithms and Optimization Spring 2021

Accelerated Gradient Descent, Spectral Graph Theory


R. Kyng & M. Probst Problem Set 2 — Tuesday, March 9th

The exercises for this week will not count toward your grade, but you are highly encouraged to
solve them all.

Exercise 1.

This exercise asks you to prove Claim 3.4.2 from Chapter 3, Section 3.4 on Accelerated Gradient Descent.
Prove that

1. $v_0 = x_0 - \frac{a_0}{\sigma} \nabla f(x_0)$

2. $L_0 = f(x_0) - \frac{a_0}{2\sigma} \|\nabla f(x_0)\|_2^2 - \frac{\sigma}{2 a_0} \|x^* - x_0\|_2^2$.

Exercise 2.

This exercise asks you to prove Claim 3.4.3 from Chapter 3, Section 3.4 on Accelerated Gradient Descent.
Prove that

1. $m_i(v) = m_i(v_i) + \frac{\sigma}{2} \|v - v_i\|_2^2$

2. $m_{i+1}(v) = m_i(v) + a_{i+1} f(x_{i+1}) + \langle a_{i+1} \nabla f(x_{i+1}), v - x_{i+1} \rangle$

3. $v_{i+1} = v_i - \frac{a_{i+1}}{\sigma} \nabla f(x_{i+1})$.
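Remark. The three parts fit together: part 1 says $m_i$ is a quadratic centered at its minimizer $v_i$, part 2 adds a linear term to it, and part 3 identifies the minimizer of the result. As a sanity check (not a proof), the sketch below verifies numerically that parts 1 and 2 force part 3; every concrete number in it ($\sigma$, $a_{i+1}$, the stand-ins for $f(x_{i+1})$, $\nabla f(x_{i+1})$, $v_i$) is an arbitrary placeholder, not a value from the notes.

```python
import numpy as np

# Numerical sanity check of Claim 3.4.3, part 3: if m_i has the form from
# part 1 and m_{i+1} is built from m_i as in part 2, then the minimizer of
# m_{i+1} is v_i - (a_{i+1}/sigma) * grad f(x_{i+1}).
# All concrete values below are arbitrary placeholders.
rng = np.random.default_rng(0)
n = 5
sigma, a_next = 2.0, 0.7             # sigma > 0, a_{i+1} > 0, arbitrary
v_i = rng.standard_normal(n)         # current minimizer of m_i
x_next = rng.standard_normal(n)      # stand-in for x_{i+1}
g = rng.standard_normal(n)           # stand-in for grad f(x_{i+1})
f_val = 1.3                          # stand-in for f(x_{i+1})
m_i_min = -0.4                       # stand-in for m_i(v_i)

def m_i(v):
    # Part 1: m_i(v) = m_i(v_i) + (sigma/2) * ||v - v_i||_2^2.
    return m_i_min + 0.5 * sigma * np.sum((v - v_i) ** 2)

def m_next(v):
    # Part 2: add the scaled linear lower bound of f at x_{i+1}.
    return m_i(v) + a_next * f_val + a_next * g @ (v - x_next)

v_next = v_i - (a_next / sigma) * g  # part 3: claimed minimizer

# m_{i+1} is a convex quadratic, so v_next should beat nearby points...
assert all(m_next(v_next) <= m_next(v_next + 1e-3 * rng.standard_normal(n))
           for _ in range(100))
# ...and its gradient, sigma*(v - v_i) + a_{i+1} * g, should vanish there.
assert np.allclose(sigma * (v_next - v_i) + a_next * g, 0.0)
print("minimizer formula checks out")
```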

Exercise 3.

This exercise is a straggler from last week, where we studied convexity. It will teach you about
Jensen’s inequality, one of the most important inequalities that we use when studying convex
functions.

1. Assume that $S \subseteq \mathbb{R}^n$ is a convex set and that the function $f : S \to \mathbb{R}$ is convex. Suppose that $x_1, \ldots, x_n \in S$ and $\theta_1, \ldots, \theta_n \ge 0$ with $\theta_1 + \cdots + \theta_n = 1$. Prove that
$$f(\theta_1 x_1 + \cdots + \theta_n x_n) \le \theta_1 f(x_1) + \cdots + \theta_n f(x_n).$$
Remark. This is typically known as Jensen's inequality and can be extended to infinite sums. If $\mathcal{D}$ is a probability distribution on $S$, and $X \sim \mathcal{D}$, then
$$f(\mathbb{E}[X]) \le \mathbb{E}[f(X)]$$
whenever both integrals are finite.

2. Prove that, for $x_1, \ldots, x_n > 0$,
$$\left( \prod_{i=1}^{n} x_i \right)^{1/n} \le \frac{1}{n} \sum_{i=1}^{n} x_i.$$

3. Prove that, for $x_1, \ldots, x_n > 0$,
$$\frac{1}{\frac{1}{n} \sum_{i=1}^{n} \frac{1}{x_i}} \le \left( \prod_{i=1}^{n} x_i \right)^{1/n}.$$
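Remark. Parts 2 and 3 are the arithmetic-geometric and geometric-harmonic mean inequalities. One standard route, though not necessarily the intended one, is to apply part 1 to the convex function $-\log$ with $\theta_1 = \cdots = \theta_n = 1/n$, and to derive part 3 from part 2 applied to $1/x_1, \ldots, 1/x_n$. A quick numerical spot-check of the chain $\mathrm{HM} \le \mathrm{GM} \le \mathrm{AM}$ on random positive inputs:

```python
import numpy as np

# Spot-check harmonic mean <= geometric mean <= arithmetic mean
# (Exercise 3, parts 2 and 3) on random positive inputs.
rng = np.random.default_rng(1)
for _ in range(1000):
    x = rng.uniform(0.01, 10.0, size=rng.integers(1, 20))
    am = x.mean()
    gm = np.exp(np.log(x).mean())    # (prod x_i)^(1/n), computed in log space
    hm = 1.0 / (1.0 / x).mean()
    assert hm <= gm + 1e-12 and gm <= am + 1e-12
print("HM <= GM <= AM holds on all samples")
```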

Exercise 4.

Let $P_n$ be the path from vertex $1$ to $n$ and $G_{1,n}$ be the graph with only the edge between vertex $1$ and $n$. Furthermore, assume that the edge between vertex $i$ and $i+1$ has positive weight $w_i$ for $1 \le i \le n-1$. Prove that
$$G_{1,n} \preceq \left( \sum_{i=1}^{n-1} \frac{1}{w_i} \right) \sum_{i=1}^{n-1} w_i \, G_{i,i+1}.$$
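Remark. Here, as in the course notes, a graph is identified with its Laplacian and $\preceq$ denotes the positive semidefinite order, so the claim is that $c \sum_i w_i L_{G_{i,i+1}} - L_{G_{1,n}}$ is positive semidefinite for $c = \sum_i 1/w_i$ (note that $\sum_i w_i G_{i,i+1}$ is exactly the weighted path $P_n$). The sketch below spot-checks this for random weights; it is an illustration, not a proof.

```python
import numpy as np

def edge_laplacian(n, i, j, w=1.0):
    """Laplacian of the n-vertex graph with a single edge {i, j} of weight w."""
    L = np.zeros((n, n))
    L[i, i] = L[j, j] = w
    L[i, j] = L[j, i] = -w
    return L

rng = np.random.default_rng(2)
n = 8
w = rng.uniform(0.1, 5.0, size=n - 1)           # positive weights w_1..w_{n-1}

L_1n = edge_laplacian(n, 0, n - 1)              # G_{1,n}: single unit edge {1, n}
L_path = sum(edge_laplacian(n, i, i + 1, w[i])  # sum_i w_i * G_{i,i+1} = P_n
             for i in range(n - 1))
c = np.sum(1.0 / w)                             # sum_i 1/w_i

# G_{1,n} <= c * P_n  iff  c * L_path - L_1n is positive semidefinite.
eigs = np.linalg.eigvalsh(c * L_path - L_1n)
assert eigs.min() >= -1e-9
print("c * L_path - L_{1,n} is PSD; smallest eigenvalue:", eigs.min())
```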

Exercise 5.

In Chapter 4, we proved that
$$\lambda_2(T_d) \ge \frac{1}{(n-1) \log_2 n}.$$
Improve this bound to $\lambda_2(T_d) \ge \frac{1}{cn}$ for some constant $c > 0$.
Hint: Use the result of the previous exercise.
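Remark. To see what the improved bound should look like, one can compute $\lambda_2$ numerically. The sketch below assumes $T_d$ is the complete binary tree with unit edge weights and $n = 2^d - 1$ vertices (one reading of Chapter 4; adjust if the notes index $T_d$ differently) and prints $n \cdot \lambda_2(T_d)$, which remains of constant order, consistent with a bound of the form $1/cn$.

```python
import numpy as np

# Numerically observe lambda_2(T_d) = Theta(1/n), assuming T_d is the
# complete binary tree with unit edge weights and n = 2^d - 1 vertices.
for d in range(2, 9):
    n = 2 ** d - 1
    L = np.zeros((n, n))
    for child in range(1, n):            # vertex 0 is the root;
        parent = (child - 1) // 2        # heap-style parent index
        L[child, child] += 1.0
        L[parent, parent] += 1.0
        L[child, parent] = L[parent, child] = -1.0
    lam2 = np.linalg.eigvalsh(L)[1]      # second-smallest Laplacian eigenvalue
    print(f"d={d:2d}  n={n:4d}  lambda_2={lam2:.5f}  n*lambda_2={n * lam2:.3f}")
```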
