Controle 16

Continuous optimization, an introduction

Assessment
(3rd January 2017)

Exercise I
We recall that for a convex function f : X → ℝ ∪ {+∞},

    prox_{τf}(x) = argmin_y  f(y) + (1/(2τ)) ‖y − x‖².

Evaluate prox_{τf}(x) for τ > 0, in each of the following cases.


1. X = ℝ, f(x) = −ln x for x > 0, +∞ for x ≤ 0.
2. f(x) = ψ(‖x‖) where ψ : ℝ → ℝ ∪ {+∞} is a convex, even function with ψ(0) = 0. Show first that f is a convex function, then evaluate prox_{τf} in terms of prox_{τψ}.

3. f(x) = ‖x‖³/3.
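For item 1, setting the derivative of y ↦ −ln y + (y − x)²/(2τ) to zero gives y² − xy − τ = 0, whose positive root is prox_{τf}(x). A quick numerical sanity check of this closed form (my own illustration, not part of the exam):

```python
import math

def prox_neg_log(x, tau):
    """prox of f(y) = -ln(y) (y > 0): the positive root of y^2 - x*y - tau = 0."""
    return 0.5 * (x + math.sqrt(x * x + 4.0 * tau))

# Stationarity check: the derivative -1/y + (y - x)/tau vanishes at the prox,
# and the prox is strictly positive even for x < 0.
x, tau = -1.3, 0.7
y = prox_neg_log(x, tau)
assert y > 0.0
assert abs(-1.0 / y + (y - x) / tau) < 1e-9
```

Note that the prox stays in the domain {y > 0} for every x ∈ ℝ, as expected from the general theory.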

Exercise II
We consider X a Hilbert space and a strictly convex lower-semicontinuous (lsc) function ψ : X → ℝ ∪ {+∞} such that the interior of dom ψ, denoted D, is not empty, D̄ = dom ψ, ψ ∈ C¹(D) ∩ C⁰(D̄), and ∂ψ(x) = ∅ for all x ∉ D. In other words, ∂ψ(x) is either ∅ (if x ∉ D), or a singleton {∇ψ(x)} (if x ∈ D). We also assume that

    lim_{‖x‖→∞} ψ(x) = +∞.

We define the “Bregman distance associated to ψ”, denoted D_ψ(x, y), as, for y ∈ D and x ∈ X,

    D_ψ(x, y) := ψ(x) − ψ(y) − ⟨∇ψ(y), x − y⟩.

1. Show that D_ψ(x, y) ≥ 0, and that D_ψ(x, y) = 0 ⇒ y = x. What other estimate can we write if in addition ψ is strongly convex? Why is D_ψ not a distance in the classical sense?

2. Express D_ψ in case D = X, ψ(x) = ‖x‖²/2. In case X = ℝⁿ, D = ]0, +∞[ⁿ, ψ(x) = Σ_{i=1}^n x_i ln x_i.
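As a check on question 2 (an illustration of mine, outside the exam), the two Bregman distances can be compared against their expected closed forms: ½‖x − y‖² for ψ = ‖·‖²/2, and the generalized Kullback–Leibler divergence Σ_i x_i ln(x_i/y_i) − Σ_i x_i + Σ_i y_i for the entropy:

```python
import math

def bregman(psi, grad_psi, x, y):
    """D_psi(x, y) = psi(x) - psi(y) - <grad psi(y), x - y>."""
    inner = sum(g * (a - b) for g, a, b in zip(grad_psi(y), x, y))
    return psi(x) - psi(y) - inner

sq = lambda v: 0.5 * sum(t * t for t in v)          # psi = ||.||^2 / 2
sq_grad = lambda v: list(v)
ent = lambda v: sum(t * math.log(t) for t in v)     # entropy on ]0, +inf[^n
ent_grad = lambda v: [math.log(t) + 1.0 for t in v]

x, y = [0.2, 0.5, 0.3], [0.4, 0.4, 0.2]
# Squared-norm case: D_psi is exactly half the squared Euclidean distance.
assert abs(bregman(sq, sq_grad, x, y)
           - 0.5 * sum((a - b) ** 2 for a, b in zip(x, y))) < 1e-12
# Entropy case: D_psi is the generalized Kullback-Leibler divergence.
kl = sum(a * math.log(a / b) for a, b in zip(x, y)) - sum(x) + sum(y)
assert abs(bregman(ent, ent_grad, x, y) - kl) < 1e-12
```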

3. Let f : X → ℝ ∪ {+∞} be a proper, convex, lsc function. Let x̄ ∈ D. We assume that there exists x ∈ D with f(x) < +∞. Show that there exists a unique point x̂ ∈ X such that

    f(x̂) + D_ψ(x̂, x̄) ≤ f(x) + D_ψ(x, x̄)   ∀x ∈ X.   (1)

4. Explain why ∂(f + ψ) = ∂f + ∂ψ. Write the first-order optimality condition for x̂. Deduce that x̂ ∈ D.

5. Show (from the first-order optimality condition) that for all x ∈ X,

    f(x) + D_ψ(x, x̄) ≥ f(x̂) + D_ψ(x̂, x̄) + D_ψ(x, x̂).   (2)

A “nonlinear” descent algorithm. We consider a minimisation problem

    min_{x∈D}  f(x) + g(x),   (P)

for f, g convex, lsc, proper functions, where f is C¹ in D and g is “simple” in the following sense: one assumes that one knows how to solve

    min_x  g(x) + ⟨p, x⟩ + (1/τ) D_ψ(x, y)
for any τ > 0, p ∈ X and y ∈ D. We suppose in addition that there exists L > 0 such that for any y ∈ D, x ∈ X

    D_f(x, y) ≤ L D_ψ(x, y).   (3)

(Here D_f(x, y) = f(x) − f(y) − ⟨∇f(y), x − y⟩.) We assume that the minimisation problem has a solution. We denote F(x) = f(x) + g(x).

6. Show that if ψ is 1-convex (strongly convex with parameter 1) and f has L-Lipschitz gradient, then (3) is true.
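A small numerical illustration of question 6 (my own, under the assumption ψ = ‖·‖²/2, which is 1-convex): for a quadratic f(x) = ½⟨Qx, x⟩ one has D_f(x, y) = ½⟨Q(x − y), x − y⟩, so (3) holds with L = λ_max(Q):

```python
# Sketch: check D_f(x, y) <= L * D_psi(x, y) for f(x) = 0.5 x^T Q x and
# psi = ||.||^2/2 (so D_psi(x, y) = 0.5 ||x - y||^2), with L >= lambda_max(Q).
Q = [[2.0, 0.5], [0.5, 1.0]]   # symmetric, eigenvalues (3 +- sqrt(2))/2
L = 2.21                       # slightly above lambda_max = (3 + sqrt(2))/2

def f(v):
    return 0.5 * sum(v[i] * Q[i][j] * v[j] for i in range(2) for j in range(2))

def grad_f(v):
    return [sum(Q[i][j] * v[j] for j in range(2)) for i in range(2)]

def d_f(x, y):
    return f(x) - f(y) - sum(g * (a - b) for g, a, b in zip(grad_f(y), x, y))

x, y = [1.0, -2.0], [0.3, 0.7]
d_psi = 0.5 * sum((a - b) ** 2 for a, b in zip(x, y))
assert 0.0 <= d_f(x, y) <= L * d_psi
```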

Given x̄ ∈ D, τ > 0, we now define the following operator: we let x̂ = T_τ(x̄) be the solution of the minimisation problem

    min_{x∈D}  f(x̄) + ⟨∇f(x̄), x − x̄⟩ + g(x) + (1/τ) D_ψ(x, x̄).   (4)

7. Explain why this problem is easy to solve. Show that if τ is small enough, one has the following descent rule: for all x ∈ X,

    F(x) + (1/τ) D_ψ(x, x̄) ≥ F(x̂) + (1/τ) D_ψ(x, x̂).

8. We define the following algorithm: we choose x^0 ∈ D, and for all k ≥ 0, let x^{k+1} = T_τ x^k, where τ ≤ 1/L is fixed. Show that for all k ≥ 0, F(x^{k+1}) ≤ F(x^k). If x* is a minimiser of F in D, show that

    F(x^k) − F(x*) ≤ (1/(kτ)) D_ψ(x*, x^0).

9. We assume that F(x) → +∞ when ‖x‖ → +∞. Why can we find x̃ ∈ D and extract a subsequence x^{k_l} such that x^{k_l} → x̃ as l → ∞? Why is x̃ a solution of (P)?

Application: minimisation in the unit simplex. One considers the case where X = ℝ^d,

    Σ = { x ∈ X : x_i ≥ 0 ∀i = 1, …, d; Σ_{i=1}^d x_i = 1 }

is the unit simplex and

    g(x) = 0 if x ∈ Σ, +∞ else.

We choose ψ(x) = Σ_{i=1}^d x_i ln x_i and D = ]0, +∞[^d.

10. Give the expression of D_ψ(x, y) for x ∈ Σ, y ∈ Σ ∩ D.

11. Show that the algorithm described in the previous part is implementable: express in detail the computation of the iterations. Hint: introduce the Lagrange multiplier for the constraint Σ_i x_i = 1.
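Working out question 11 (a hedged sketch of the standard computation, not the official solution): with the entropy ψ, step (4) gives the multiplicative update x_i^{k+1} ∝ x_i^k exp(−τ ∂_i f(x^k)), where the normalization comes from the Lagrange multiplier of Σ_i x_i = 1. A toy run on f(x) = ½‖x − c‖² with c in the simplex:

```python
import math

def eg_step(x, grad, tau):
    """One entropic mirror-descent step on the unit simplex
    (exponentiated gradient): x_i <- x_i * exp(-tau * g_i), renormalized."""
    w = [xi * math.exp(-tau * gi) for xi, gi in zip(x, grad)]
    s = sum(w)   # exp of the Lagrange multiplier of sum_i x_i = 1
    return [v / s for v in w]

c = [0.7, 0.2, 0.1]
f = lambda x: 0.5 * sum((a - b) ** 2 for a, b in zip(x, c))
x = [1.0 / 3.0] * 3
for _ in range(500):
    x = eg_step(x, [a - b for a, b in zip(x, c)], tau=0.5)

# Iterates stay in the open simplex, and f decreases towards min = f(c) = 0.
assert abs(sum(x) - 1.0) < 1e-9 and all(v > 0.0 for v in x)
assert f(x) < 1e-6
```

Note that the iterates remain strictly positive by construction, which is why the algorithm never needs to evaluate ∂g explicitly.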

Exercise III
We consider a maximal monotone operator A in a (real) Hilbert space X. We consider also a “metric” M, that is, a continuous, coercive, and symmetric operator:

    ‖Mx‖ ≤ ‖M‖‖x‖ ∀x ∈ X,   ⟨Mx, x⟩ ≥ δ‖x‖²,   ⟨Mx, y⟩ = ⟨x, My⟩

for all x, y ∈ X, where δ > 0.

1. Show that (x, y) ↦ ⟨Mx, y⟩ =: ⟨x, y⟩_M defines a scalar product which is equivalent to the scalar product ⟨·, ·⟩. Show that for all y ∈ X, the problem

    min_x  (1/2)‖x‖²_M − ⟨y, x⟩

has a unique solution. Deduce that M is invertible. We have denoted ‖·‖_M the Hilbertian norm induced by the M-scalar product.

2. Show that M⁻¹A is a maximal monotone operator in the M-scalar product. Deduce from Minty’s theorem that for any y ∈ X, there exists a unique x such that

    M(x − y) + Ax ∋ 0.

3. We consider A, B two maximal monotone operators and K ∈ L(X, X) a continuous, linear operator in X. We define in X × X the metric, for τ, σ > 0,

    M := [ (1/τ)I    −K*   ]
         [ −K      (1/σ)I ].

Here I ∈ L(X, X) is the identity operator. Show that if τσ < 1/‖K‖², M is continuous and coercive in X × X.

4. Deduce that (for such τ, σ) one can define the following algorithm: we let (x^0, y^0) ∈ X × X and define for each k ≥ 0 the new point (x^{k+1}, y^{k+1}) as follows:

    M [ x^{k+1} − x^k ]   [ 0    K* ] [ x^{k+1} ]   [ A x^{k+1}    ]
      [ y^{k+1} − y^k ] + [ −K   0  ] [ y^{k+1} ] + [ B⁻¹ y^{k+1} ] ∋ 0.

Express this as a first iteration defining x^{k+1} from x^k, y^k and then an iteration defining y^{k+1} from x^k, x^{k+1}, y^k.
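Making the two rows explicit (my derivation, to be checked against the intended answer): the first row gives x^{k+1} = (I + τA)⁻¹(x^k − τK*y^k), and the second then gives y^{k+1} = (I + σB⁻¹)⁻¹(y^k + σK(2x^{k+1} − x^k)), i.e. the Chambolle–Pock primal–dual scheme. A hypothetical toy instance with A = ∂g for g = ½‖· − b‖², B⁻¹ = ∂h* for h = ‖·‖_1 (its resolvent is a clip to [−1, 1]), and K = I, which solves min_x ½‖x − b‖² + ‖x‖_1:

```python
import math

# Toy run of the scheme above with K = I (so ||K|| = 1 and tau*sigma < 1 holds).
# Resolvent of tau*A for g = 0.5*||. - b||^2:  v -> (v + tau*b) / (1 + tau).
# Resolvent of sigma*B^{-1} for h = ||.||_1:   v -> clip(v, [-1, 1]).
b = [2.0, -0.1, 1.5]
tau, sigma = 0.5, 0.5

clip = lambda v: max(-1.0, min(1.0, v))
x = [0.0] * 3
y = [0.0] * 3
for _ in range(300):
    x_new = [(xi - tau * yi + tau * bi) / (1.0 + tau)
             for xi, yi, bi in zip(x, y, b)]
    y = [clip(yi + sigma * (2.0 * xn - xi))
         for yi, xn, xi in zip(y, x_new, x)]
    x = x_new

# For K = I the minimiser is the soft-thresholding of b at level 1.
soft = [math.copysign(max(abs(v) - 1.0, 0.0), v) for v in b]
assert all(abs(a - s) < 1e-6 for a, s in zip(x, soft))
```

The over-relaxed term 2x^{k+1} − x^k in the dual update is exactly what the off-diagonal blocks of M produce.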

5. In what case does (x^k, y^k) converge (and in what sense)? In this case, what does the limit (x̄, ȳ) satisfy? Write, in particular, an equation for x̄.

6. We now consider a maximal monotone operator C and the new iterative scheme:

    M [ x^{k+1} − x^k ]   [ 0    K* ] [ x^{k+1} ]   [ A x^{k+1}    ]   [ C x^k ]
      [ y^{k+1} − y^k ] + [ −K   0  ] [ y^{k+1} ] + [ B⁻¹ y^{k+1} ] ∋ [ 0     ].

Under which condition on τ, σ, C will this iterative scheme converge? To which limit?
