
FEniCS Course

Lecture 14: From sensitivities to optimisation

Contributors
Simon Funke
1 / 12
What is PDE-constrained optimisation?

Optimisation problems where at least one constraint is a partial differential equation.
Applications
• Data assimilation.
Example: Weather modelling.
• Shape and topology optimisation.
Example: Optimal shape of an aerofoil.
• Parameter estimation.
• Optimal control.
• ...

2 / 12
Hello World of PDE-constrained optimisation!
We will solve the optimal control of the Poisson equation:

$$\min_{u,m} \ \frac{1}{2} \int_\Omega \| u - u_d \|^2 \,\mathrm{d}x + \frac{\alpha}{2} \int_\Omega \| m \|^2 \,\mathrm{d}x$$

subject to

$$-\Delta u = m \ \text{in } \Omega, \qquad u = u_0 \ \text{on } \partial\Omega$$

• This problem can be physically interpreted as: find the heating/cooling term m for which u best approximates the desired heat distribution $u_d$.
• The second term in the objective functional, known as Tikhonov regularisation, ensures existence and uniqueness of a solution for α > 0.

3 / 12
The canonical abstract form

$$\min_{u,m} J(u, m) \quad \text{subject to} \quad F(u, m) = 0,$$

with
• the objective functional J.
• the parameter m.
• the PDE operator F with solution u, parametrised by m.

4 / 12
The reduced problem

$$\min_m \tilde{J}(m) = J(u(m), m)$$

with
• the reduced functional $\tilde{J}$, obtained by eliminating the PDE constraint: u(m) solves F(u(m), m) = 0.
• the parameter m.

How do we solve this problem?


• Gradient descent.
• Newton method.
• Quasi-Newton methods.

5 / 12
Gradient descent
Algorithm
1 Choose an initial parameter value $m_0$ and a step size $\gamma > 0$.
2 For $i = 0, 1, \ldots$:
• $m_{i+1} = m_i - \gamma \nabla \tilde{J}(m_i)$

Features
+ Easy to implement.
– Slow convergence.
6 / 12
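To make the update rule concrete, here is a minimal, self-contained gradient descent sketch on a small quadratic test problem. It is plain NumPy, not dolfin-adjoint; the matrix A, vector b, step size and iteration count are illustrative assumptions, not from the lecture.

Python code
import numpy as np

# Hypothetical test functional: J(m) = 0.5*||A m - b||^2
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 0.0])

def grad_J(m):
    # gradient of 0.5*||A m - b||^2 is A^T (A m - b)
    return A.T @ (A @ m - b)

m = np.zeros(2)   # initial parameter value m0
gamma = 0.1       # fixed step size gamma > 0
for i in range(200):
    m = m - gamma * grad_J(m)   # m_{i+1} = m_i - gamma * grad J(m_i)

print(m)   # slowly approaches the minimiser of J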
Newton method
Optimisation problem: $\min_m \tilde{J}(m)$.
Optimality condition:
$$\nabla \tilde{J}(m) = 0. \qquad (1)$$

Newton method applied to (1):
1 Choose an initial parameter value $m_0$.
2 For $i = 0, 1, \ldots$:
• Solve $H(\tilde{J})(m_i)\,\delta m = -\nabla \tilde{J}(m_i)$, where $H$ denotes the Hessian.
• $m_{i+1} = m_i + \delta m$

Features
+ Fast (locally quadratic) convergence.
– Requires iteratively solving a linear system with
the Hessian, which might require many Hessian
action computations.
– Hessian might not be positive definite, resulting in
an update δm which is not a descent direction.
7 / 12
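The following minimal sketch shows the Newton iteration on a small unconstrained test problem. It is plain NumPy with a direct Hessian solve; the functional and starting point are illustrative assumptions. For PDE-constrained problems the Hessian system is typically solved matrix-free (e.g. with CG), using only Hessian actions.

Python code
import numpy as np

# Hypothetical test functional: J(m) = m0^4 - 2*m0 + m1^2
def grad_J(m):
    return np.array([4.0*m[0]**3 - 2.0, 2.0*m[1]])

def hess_J(m):
    return np.array([[12.0*m[0]**2, 0.0],
                     [0.0, 2.0]])

m = np.array([1.0, 1.0])   # initial parameter value m0
for i in range(20):
    # solve H(J)(m_i) dm = -grad J(m_i)
    dm = np.linalg.solve(hess_J(m), -grad_J(m))
    m = m + dm   # m_{i+1} = m_i + dm

print(m)   # converges quadratically to approximately [0.7937, 0.0]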
Quasi-Newton methods

Like Newton method, but replace the exact Hessian with a low-rank approximation built from gradient information only. A common approximation scheme is BFGS.
Features
+ Robust: Hessian approximation is always positive
definite.
+ Cheap: No Hessian computation required, only
gradient computations.
– Only superlinear convergence rate.

8 / 12
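In practice one rarely implements BFGS by hand; libraries such as SciPy provide it. A minimal sketch on a hypothetical test functional (note that only the gradient is supplied, never a Hessian):

Python code
import numpy as np
from scipy.optimize import minimize

def J(m):
    return (m[0] - 1.0)**2 + 10.0*(m[1] + 0.5)**2

def grad_J(m):
    return np.array([2.0*(m[0] - 1.0), 20.0*(m[1] + 0.5)])

# BFGS builds its positive definite Hessian approximation from gradients only
res = minimize(J, x0=np.zeros(2), jac=grad_J, method="BFGS")
print(res.x)   # approximately [1.0, -0.5]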
Solving the optimal Poisson problem
Python code
from fenics import *
from dolfin_adjoint import *

# Solve Poisson problem (defines the state s and the control f)
# ...

J = Functional(inner(s, s)*dx)   # objective functional
m = SteadyParameter(f)           # select f as the optimisation parameter

rf = ReducedFunctional(J, m)

m_opt = minimize(rf, method="L-BFGS-B", tol=1e-2)

Tips
• You can call print_optimization_methods() to list all available methods.
• Use maximize if you want to solve a maximisation problem.
9 / 12
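For reference, here is a minimal sketch of what the elided "# Solve Poisson problem" step might look like in the classic FEniCS/dolfin-adjoint API. The mesh resolution, function space and boundary value are assumptions for illustration; dolfin-adjoint records the solve automatically.

Python code
from fenics import *
from dolfin_adjoint import *

mesh = UnitSquareMesh(32, 32)
V = FunctionSpace(mesh, "CG", 1)

f = Function(V)    # control (source term m), initial guess zero
s = Function(V)    # state u
v = TestFunction(V)

bc = DirichletBC(V, 0.0, "on_boundary")
F = inner(grad(s), grad(v))*dx - f*v*dx   # weak form of -Delta u = m
solve(F == 0, s, bc)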
Bound constraints

Sometimes it is useful to specify lower and upper bounds for the parameters:
$$l_b \le m \le u_b. \qquad (2)$$
Example:
Python code
lb = interpolate(Constant(0), V)
ub = interpolate(Expression("x[0]"), V)
m_opt = minimize(rf, method="L-BFGS-B",
                 bounds=[lb, ub])

Note: Not all optimisation algorithms support bound constraints.

10 / 12
Inequality constraints

Sometimes it is useful to specify (in-)equality constraints on the parameters:
$$g(m) \le 0. \qquad (3)$$
You can do that by subclassing the InequalityConstraint class; a sketch follows below.
For more information visit the Example section on dolfin-adjoint.org.

11 / 12
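As a rough sketch, a scalar constraint could be implemented along the following lines. This follows the pattern of the dolfin-adjoint examples of that era; the exact method names, argument types and sign convention may differ between versions, so treat it as a schematic rather than a definitive implementation.

Python code
from fenics import *
from dolfin_adjoint import *

class MeanConstraint(InequalityConstraint):
    # Hypothetical constraint: integral of m over Omega minus a bound
    def __init__(self, V, bound):
        self.bound = float(bound)
        # assembled "mass" vector, so that integral(m) = smass . m
        self.smass = assemble(TestFunction(V)*Constant(1)*dx)

    def function(self, m):
        # value(s) of the constraint at m (one scalar component here)
        return [self.smass.inner(m.vector()) - self.bound]

    def jacobian(self, m):
        # derivative of each constraint component with respect to m
        return [self.smass]

    def length(self):
        # number of constraint components
        return 1

# hypothetical usage with an algorithm that supports constraints:
# m_opt = minimize(rf, method="SLSQP", constraints=MeanConstraint(V, 0.3))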
The FEniCS challenge!

1 Solve the "Hello World" PDE-constrained optimisation problem on the unit square with $u_d(x, y) = \sin(\pi x)\sin(\pi y)$, homogeneous boundary conditions and $\alpha = 10^{-6}$.
2 Compute the difference between the computed heat profile and $u_d$ before and after the optimisation.
3 Use the optimisation algorithms SLSQP, Newton-CG and L-BFGS-B and compare them.
4 What happens if you increase $\alpha$?

12 / 12
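A possible starting point for step 1 of the challenge (the expression and weight simply transcribe the problem statement; mesh resolution and element choice are assumptions):

Python code
from fenics import *
from dolfin_adjoint import *

mesh = UnitSquareMesh(64, 64)
V = FunctionSpace(mesh, "CG", 1)

u_d = interpolate(Expression("sin(pi*x[0])*sin(pi*x[1])"), V)   # desired state
alpha = 1e-6                                                    # regularisation weight

# ... solve the Poisson problem with homogeneous Dirichlet BCs,
# build J and the ReducedFunctional, and call minimize() as on slide 9.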
