Lecture 14: From Sensitivities to Optimisation
Contributors
Simon Funke
What is PDE-constrained optimisation?
Optimisation problems whose constraints include partial differential equations: we minimise an objective over a state and a parameter that are coupled through a PDE, as formalised on the following slides.
Hello World of PDE-constrained optimisation!
We will solve the optimal control of the Poisson equation:

min_{u,m}  (1/2) ∫_Ω ‖u − u_d‖² dx + (α/2) ∫_Ω ‖m‖² dx

subject to
−∆u = m  in Ω
u = u0   on ∂Ω
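As a point of reference for the later slides, here is a minimal FEniCS sketch of the forward (state) problem, assuming homogeneous boundary data u0 = 0; the mesh size and discretisation are illustrative choices.

Python code

from fenics import *

mesh = UnitSquareMesh(32, 32)
V = FunctionSpace(mesh, "CG", 1)

m = Function(V)                            # the control (source term)
u, v = TrialFunction(V), TestFunction(V)
a = inner(grad(u), grad(v))*dx             # weak form of -Delta u = m
L = m*v*dx
bc = DirichletBC(V, 0.0, "on_boundary")    # assumes u0 = 0

u = Function(V)
solve(a == L, u, bc)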
The canonical abstract form

min_{u,m} J(u, m)

subject to:
F(u, m) = 0,

with
• the objective functional J.
• the parameter m.
• the PDE operator F with solution u, parametrised by m.
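For the Poisson example above, J(u, m) = (1/2) ∫_Ω ‖u − u_d‖² dx + (α/2) ∫_Ω ‖m‖² dx, and F(u, m) = −∆u − m (together with the boundary condition u = u0 on ∂Ω).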
The reduced problem

min_m J̃(m) = J(u(m), m)

with
• the reduced functional J̃.
• the parameter m.
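Eliminating the state u this way turns the constrained problem into an unconstrained one over m alone, at the price of a PDE solve for every evaluation of J̃. Its gradient follows from the chain rule,

∇J̃(m) = ∂J/∂u · du/dm + ∂J/∂m,

which is exactly the sensitivity that the adjoint approach computes efficiently.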
Gradient descent

Algorithm
1. Choose an initial parameter value m_0 and a step size γ > 0.
2. For i = 0, 1, . . .:
   • m_{i+1} = m_i − γ ∇J̃(m_i)

Features
+ Easy to implement.
− Slow convergence.
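A minimal NumPy sketch of the iteration above, assuming a callable grad_J that returns ∇J̃(m) (for instance computed via the adjoint); the fixed step size and stopping tolerance are illustrative.

Python code

import numpy as np

def gradient_descent(grad_J, m0, gamma=0.1, tol=1e-8, max_iter=1000):
    """Steepest descent with a fixed step size gamma."""
    m = np.asarray(m0, dtype=float)
    for _ in range(max_iter):
        g = grad_J(m)                  # gradient of the reduced functional
        if np.linalg.norm(g) < tol:    # approximately stationary: stop
            break
        m = m - gamma * g              # descent step
    return m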
Newton method

Optimisation problem: min_m J̃(m).
Optimality condition:

∇J̃(m) = 0. (1)

Newton's method applies Newton iteration to (1): at each step, solve the linear system
∇²J̃(m_i) δm = −∇J̃(m_i)
for the update δm, then set m_{i+1} = m_i + δm.

Features
+ Fast (locally quadratic) convergence.
− Requires iteratively solving a linear system with the Hessian, which might require many Hessian action computations.
− The Hessian might not be positive definite, resulting in an update δm which is not a descent direction.
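A dense-algebra sketch of one possible implementation, assuming callables grad_J and hess_J for ∇J̃ and ∇²J̃ (both names are illustrative); in PDE-constrained problems the Hessian is typically available only through its action, so the direct solve below would be replaced by a Krylov method.

Python code

import numpy as np

def newton(grad_J, hess_J, m0, tol=1e-10, max_iter=50):
    """Newton's method on the optimality condition grad J~(m) = 0."""
    m = np.asarray(m0, dtype=float)
    for _ in range(max_iter):
        g = grad_J(m)
        if np.linalg.norm(g) < tol:
            break
        dm = np.linalg.solve(hess_J(m), -g)   # Newton system: H dm = -g
        m = m + dm                            # full step, no line search
    return m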
Quasi-Newton methods
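Quasi-Newton methods such as BFGS address both drawbacks of Newton's method: they build a positive-definite approximation of the Hessian from successive gradient differences, so no Hessian solves are needed. A minimal sketch on a toy quadratic, using SciPy's BFGS implementation (the functional and its gradient are illustrative stand-ins for J̃ and ∇J̃):

Python code

import numpy as np
from scipy.optimize import minimize

def J(m):                        # toy quadratic reduced functional
    return 0.5 * np.dot(m, m)

def grad_J(m):                   # its exact gradient
    return m

res = minimize(J, x0=np.ones(4), jac=grad_J, method="BFGS")
print(res.x)                     # approximate minimiser, close to zero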
Solving the optimal Poisson problem

Python code

from fenics import *
from dolfin_adjoint import *

# ... set up and solve the Poisson problem for the state u,
# with the source f as the control ...

s = u - u_d                      # assumed: misfit between state and target
J = Functional(inner(s, s)*dx)
m = SteadyParameter(f)           # declare f as the optimisation parameter

Tips
• You can call print_optimization_methods() to list all available methods.
• Use maximize if you want to solve a maximisation problem.
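The reduced functional J̃ is then formed from J and m and minimised; a minimal sketch using dolfin-adjoint's optimisation interface (the method choice is illustrative):

Python code

Jhat = ReducedFunctional(J, m)
m_opt = minimize(Jhat, method="L-BFGS-B")   # or: maximize(Jhat)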
Bound constraints
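In dolfin-adjoint, pointwise bounds on the control can be passed directly to minimize; a minimal sketch, assuming the Jhat from the previous slide (the bound values and method choice are illustrative):

Python code

# Constrain the control to 0 <= m <= 1 pointwise:
m_opt = minimize(Jhat, method="L-BFGS-B", bounds=(0.0, 1.0))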
Inequality constraints
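More general inequality constraints g(m) ≥ 0 can also be attached to minimize; in dolfin-adjoint this is done by subclassing its InequalityConstraint class (see the dolfin-adjoint documentation for the interface, which varies between versions). A hedged sketch of the call, with my_constraint assumed to be such a subclass instance:

Python code

# SLSQP handles general inequality constraints; my_constraint is assumed
# to implement dolfin-adjoint's InequalityConstraint interface.
m_opt = minimize(Jhat, method="SLSQP", constraints=my_constraint)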
The FEniCS challenge!