Physics-Informed Deep Learning
Paris Perdikaris
April 7, 2020
https://github.com/PredictiveIntelligenceLab/USNCCM15-Short-Course-Recent-Advances-in-Physics-Informed-Deep-Learning
Motivation and open challenges

Goal: Predictive modeling, analysis and optimization of complex systems

Hypothesis: combine data-driven machine learning (ML) with prior knowledge from computational science and engineering (CSE), e.g., via Bayes' rule: p(θ|D) ∝ p(D|θ) p(θ)

Challenges:
• Robust design/control
• Large parameter spaces
• Uncertainty quantification
• High cost of data acquisition
• Limited and high-dimensional data
• Incomplete models, imperfect data (e.g., missing data, outliers, complex noise processes)
• Multiple tasks and data modalities (e.g., images, time-series, scattered measurements, etc.)

• Can we bridge knowledge from scientific computing and machine learning to tackle these challenges?
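Bayes' rule above can be made concrete with a minimal discrete sketch (the candidate parameter values, prior, and observations below are made up purely for illustration):

```python
# Minimal discrete illustration of p(theta|D) ∝ p(D|theta) p(theta).
thetas = [0.2, 0.5, 0.8]          # hypothetical candidate parameter values
prior = [1 / 3, 1 / 3, 1 / 3]     # uniform prior p(theta)

def likelihood(theta, data):
    # Bernoulli likelihood p(D|theta) for a sequence of 0/1 observations
    p = 1.0
    for d in data:
        p *= theta if d == 1 else (1.0 - theta)
    return p

data = [1, 1, 0, 1]               # made-up observations
unnorm = [likelihood(t, p_t) * 1 for t, p_t in zip(thetas, data)] if False else \
         [likelihood(t, data) * p for t, p in zip(thetas, prior)]
posterior = [u / sum(unnorm) for u in unnorm]
# posterior mass shifts toward theta = 0.8 after mostly-1 observations
```

The posterior concentrates on the parameter value most consistent with the data, which is the sense in which prior knowledge regularizes data-driven inference.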
Prior knowledge can be baked in with specialized neural architectures. A fully-connected network with output layer 𝒩^L(x) = W^L 𝒩^{L−1}(x) + b^L ∈ ℝ^{N_L} is used to approximate the solution of a PDE of the general form

(2.1)  f(x; ∂u/∂x₁, …, ∂u/∂x_d; ∂²u/∂x₁∂x₁, …, ∂²u/∂x₁∂x_d; …) = 0,  x ∈ Ω,

with boundary conditions (BC) u(x, t) = g_D(x, t) on Γ_D ⊂ ∂Ω and (∂u/∂n)(x, t) = g_R(u, x, t) on Γ_R ⊂ ∂Ω. The initial condition (IC) is treated as a special type of boundary condition. T_f and T_b denote the two sets of residual points for the equation and the BC/IC; the physics enters the loss as a regularization term alongside the data fit.
Physics-informed Neural Networks

Physics-informed neural networks for solving PDEs. We consider the following PDE parameterized by λ for the solution u(x) with x = (x₁, …, x_d) on a domain Ω ⊂ ℝ^d:

f(x; ∂u/∂x₁, …, ∂u/∂x_d; ∂²u/∂x₁∂x₁, …, ∂²u/∂x₁∂x_d; …; λ) = 0,  x ∈ Ω,

with suitable boundary conditions B(u, x) = 0 on ∂Ω, where B(u, x) could be Dirichlet, Neumann, Robin, or periodic boundary conditions. For time-dependent problems, we consider time t as a special component of x, and Ω contains the temporal domain. The initial condition can be simply treated as a special type of Dirichlet boundary condition on the spatio-temporal domain.

[Schematic: a network NN(x, t; θ) outputs û; automatic differentiation supplies ∂û/∂t, ∂²û/∂x², …, which are assembled into the PDE(λ) residual and the BC & IC terms over the point sets T_f and T_b; minimizing the resulting loss over θ yields θ*.]
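The "automatic differentiation" step in the schematic can be illustrated with a toy forward-mode sketch based on dual numbers. This is only a stand-in for the reverse-mode autodiff of a real deep learning framework, and the one-neuron "network" and its weights are made up:

```python
# Toy forward-mode automatic differentiation with dual numbers.
import math

class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot       # value and derivative part

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)

def tanh(z):
    t = math.tanh(z.val)
    return Dual(t, (1.0 - t * t) * z.dot)   # chain rule

# A one-neuron "network" u(x) = tanh(w * x + b), with made-up weights.
w, b = 1.7, -0.3
def u(x):
    return tanh(Dual(w) * x + Dual(b))

x0 = Dual(0.5, 1.0)                         # seed dx/dx = 1
out = u(x0)
# out.dot is the exact derivative du/dx at x = 0.5
exact = (1.0 - math.tanh(w * 0.5 + b) ** 2) * w
```

Exact derivatives of the network output with respect to its inputs are what let a PINN evaluate terms like ∂û/∂t and ∂²û/∂x² without any mesh.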
Fig. 1. Schematic of a PINN for solving the diffusion equation ∂u/∂t = ∂²u/∂x² with mixed boundary conditions (BC) u(x, t) = g_D(x, t) on Γ_D ⊂ ∂Ω and (∂u/∂n)(x, t) = g_R(u, x, t) on Γ_R ⊂ ∂Ω. The initial condition (IC) is treated as a special type of boundary condition. T_f and T_b denote the two sets of residual points for the equation and the BC/IC.

Psichogios, D. C., & Ungar, L. H. (1992). A hybrid neural network-first principles approach to process modeling. AIChE Journal, 38(10), 1499-1511.
Lagaris, I. E., Likas, A., & Fotiadis, D. I. (1998). Artificial neural networks for solving ordinary and partial differential equations. IEEE Transactions on Neural Networks, 9(5), 987-1000.
Raissi, M., Perdikaris, P., & Karniadakis, G. E. (2019). Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. Journal of Computational Physics, 378, 686-707.
Lu, L., Meng, X., Mao, Z., & Karniadakis, G. E. (2019). DeepXDE: A deep learning library for solving differential equations. arXiv preprint arXiv:1907.04502.
Physics-informed neural networks (PINNs) aim at inferring a continuous latent function u(x, t) that arises as the solution to a system of nonlinear partial differential equations (PDE) of the general form

u_t + N_x[u] = 0,  x ∈ Ω,  t ∈ [0, T],
u(x, 0) = h(x),  x ∈ Ω,

and define the residual of the PDE as f := ∂u/∂t + N_x[u]. The corresponding loss function is given by

L(θ) := L_u(θ) + L_r(θ) + L_{u0}(θ) + L_{ub}(θ)
        (data fit)  (PDE residual)  (ICs fit)  (BCs fit)

Training via stochastic gradient descent:

θ_{n+1} = θ_n − η ∇_θ L(θ_n)

The shared parameters between the neural networks u(t, x) and f(t, x) can be learned by minimizing the mean squared error loss
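The gradient descent update above can be sketched on a toy scalar loss (a stand-in quadratic, not the PINN loss itself):

```python
# Plain gradient descent on a toy loss L(theta) = (theta - 3)^2,
# illustrating the update theta <- theta - eta * grad L(theta).
def grad_L(theta):
    return 2.0 * (theta - 3.0)

theta, eta = 0.0, 0.1
for _ in range(200):
    theta = theta - eta * grad_L(theta)
# theta converges toward the minimizer theta* = 3
```

In practice the gradient ∇_θ L is evaluated on mini-batches of data and collocation points, and adaptive optimizers (e.g., Adam) replace the fixed step size.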
where

MSE_u = (1 / N_u) Σ_{i=1}^{N_u} |u(t_u^i, x_u^i) − u^i|²,

and

MSE_f = (1 / N_f) Σ_{i=1}^{N_f} |f(t_f^i, x_f^i)|²,

and {t_f^i, x_f^i}_{i=1}^{N_f} specify the collocation points for f(t, x). The loss MSE_u corresponds to the initial and boundary data while MSE_f enforces the structure imposed by equation (3) at a finite set of collocation points.
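A minimal sketch of assembling MSE_u and MSE_f: here a fixed smooth function stands in for the trained network, central finite differences stand in for automatic differentiation, and a Burgers-type residual u_t + u u_x − ν u_xx is chosen only as an example; all data and collocation points are made up.

```python
import math

nu = 0.01 / math.pi  # illustrative viscosity for a Burgers-type residual

def u_hat(t, x):
    # Stand-in "network": any smooth function of (t, x).
    return math.exp(-t) * math.sin(math.pi * x)

def residual(t, x, h=1e-4):
    # f = u_t + u u_x - nu u_xx, derivatives via central differences
    # (a real PINN would use automatic differentiation instead).
    u = u_hat(t, x)
    u_t = (u_hat(t + h, x) - u_hat(t - h, x)) / (2 * h)
    u_x = (u_hat(t, x + h) - u_hat(t, x - h)) / (2 * h)
    u_xx = (u_hat(t, x + h) - 2 * u + u_hat(t, x - h)) / h ** 2
    return u_t + u * u_x - nu * u_xx

# MSE_u over (t, x, u) data triples; MSE_f over collocation points.
data = [(0.0, 0.5, 1.0), (0.0, -0.5, -1.0)]      # made-up data
colloc = [(0.1, 0.0), (0.2, 0.3), (0.5, -0.7)]   # made-up collocation points
mse_u = sum((u_hat(t, x) - u) ** 2 for t, x, u in data) / len(data)
mse_f = sum(residual(t, x) ** 2 for t, x in colloc) / len(colloc)
loss = mse_u + mse_f
```

Minimizing `loss` over the network parameters simultaneously fits the data and drives the PDE residual toward zero at the collocation points.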
Figure 1: Burgers' equation: Top: Predicted solution u(t, x) along with the initial and boundary training data. In addition we are using 10,000 collocation points generated using a Latin Hypercube Sampling strategy. Bottom: Comparison of the predicted and exact solutions corresponding to the three temporal snapshots depicted by the white vertical lines in the top panel. The relative L² error for this case is 6.7 · 10⁻⁴. Model training took approximately 60 seconds on a single NVIDIA Titan X GPU card.
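The Latin Hypercube Sampling mentioned in the caption can be sketched in 2-D with the standard stratify-then-shuffle construction (function name and seed are illustrative):

```python
import random

def latin_hypercube(n, seed=0):
    # n samples in [0, 1]^2: each axis is split into n equal bins and
    # each bin contains exactly one sample per axis.
    rng = random.Random(seed)
    xs = [(i + rng.random()) / n for i in range(n)]
    ys = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(ys)  # decorrelate the two axes
    return list(zip(xs, ys))

pts = latin_hypercube(10)
# every axis bin [i/n, (i+1)/n) contains exactly one sample
```

Compared to plain uniform sampling, this stratification spreads collocation points more evenly over the spatio-temporal domain.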
Physics-informed Neural Networks
Raissi, M., Yazdani, A., & Karniadakis, G. E. (2020). Hidden fluid mechanics: Learning velocity and pressure fields from flow visualizations. Science.
Extensions to CNNs and GCNs: Differentiable Physics-informed Graph Networks

Here a general differential operator acts on u(s), the field variables of interest; f(s) is the source field; and K(s) denotes an input property field characterizing the system's constitutive behavior. B is the operator for the boundary conditions.
Physics-informed neural networks for cardiac activation mapping

We use two neural networks to approximate the activation time T(x, y) and the conduction velocity V(x, y), with a loss function that accounts for the similarity between the output of the network and the data, the physics of the problem, and regularization terms:

arg min_{θ_T, θ_V} L(θ_T, θ_V)   (7)

To train the neural networks with different prior functions defined by the parameters θ̃_T, θ̃_V, we randomly sample the parameters with Glorot initialization [20]. Additionally, we perturb our data with Gaussian noise with variance σ_N² to train each network of the ensemble with a slightly different dataset. Our final prediction is obtained as the mean output of the ensemble of neural networks.

2.3. Active Learning. We take advantage of the uncertainty estimates described in the previous section to inform physicians about the quality of the activation map, as well as to use active learning techniques to create an adaptive sampling algorithm.

2.4. Application to Surfaces From Electro-Anatomic Mapping. During electro-anatomic mapping, data can only be acquired on the cardiac surface, either of the ventricles or the atria. We thus represent the resulting map as a surface in three dimensions and neglect the thickness of the atrial wall. This is a reasonable assumption since the thickness-to-diameter ratio of the atria is in the order of 0.05. Our assumption implies that the electrical wave can only travel along the surface and not perpendicular to it. To account for this constraint, we include an additional loss term:

L_N = α_N (1 / (2 N_R)) Σ_{i=1}^{N_R} (∇T(x_i) · N_i)²   (10)

This form favors solutions where the activation time gradients are orthogonal to the surface normals N_i. To implement this constraint, we assume a triangular discretization of either the left or right atrium, which we obtain from magnetic resonance imaging or computed-tomography imaging. We compute N_i for each triangle and then define the N_R collocation points as the centroids of each triangle in the mesh, enforcing the constraint weakly through the factor α_N.

FIGURE 6 | Benchmark problem and active learning. We perform 30 simulations of active learning with different initial samples and compare them against a Latin hypercube design. In the middle and right box plots, we observe a significant reduction in activation time normalized root mean squared error (p < 10⁻⁷) and in conduction velocity normalized mean absolute error (p < 0.015) when using the active learning strategy.

Sahli Costabal, F., Yang, Y., Perdikaris, P., Hurtado, D. E., & Kuhl, E. (2020). Physics-informed neural networks for cardiac activation mapping. Frontiers in Physics, 8, 42.
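The surface-orthogonality penalty in equation (10) is just a mean of squared dot products between gradients and normals. A minimal sketch (the gradients, normals, and α_N below are made up for illustration):

```python
def normal_penalty(grads, normals, alpha_n=1.0):
    # L_N = alpha_n / (2 N_R) * sum_i (grad T(x_i) . N_i)^2   (cf. eq. 10)
    n_r = len(grads)
    dot = lambda a, b: sum(ai * bi for ai, bi in zip(a, b))
    return alpha_n / (2 * n_r) * sum(dot(g, n) ** 2
                                     for g, n in zip(grads, normals))

# Gradients tangent to the surface incur no penalty; normal components do.
normals = [(0.0, 0.0, 1.0)] * 2              # unit normals of two triangles
tangent = [(1.0, 2.0, 0.0), (0.5, 0.0, 0.0)]
tilted = [(1.0, 2.0, 2.0), (0.5, 0.0, 1.0)]
p_tangent = normal_penalty(tangent, normals)
p_tilted = normal_penalty(tilted, normals)
```

Since only the component of ∇T along N_i is penalized, the network remains free to fit arbitrary activation-time variation within the surface.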
Physics-informed filtering of 4D-flow MRI
[Panels: velocity magnitude, V-velocity, W-velocity.]
Recent advances

3.2. The KdV equation

As a mathematical model of waves on shallow water surfaces one could consider the Korteweg-de Vries (KdV) equation. The KdV equation reads as
Discovery of ODEs Discovery of PDEs
Exact Dynamics Learned Dynamics ut = uux uxxx . (6)
Exact Dynamics Learned Dynamics
50 50 20 20
To obtain a set of training data 2.0 we simulate the KdV equation 2.0 (6) using
25
z
25
z conventional 10
spectral methods. In1.5particular, 10
we start from an1.5initial con-
1.0 1.0
dition u(0, x)0 = sin(⇡x/20), x 20.5[ 20,020] and assume periodic 0.5 boundary
x
0 0 conditions.10We integrate equation 0.0 (6) up10to the final time t = 40.0.0We use the
0.5 0.5
40 40
Chebfun package
20
[43] with a spectral 1.0
Fourier
20
discretization with 512 modes
20 0 20 0 and a fourth-order
0 10 explicit
20 30Runge-Kutta
40 temporal
0 10 integrator
20 30 40with time-step
t
0
20
40
y 0
20
40
y
size 10 4 . The solution is saved u(t, every t = 0.2 tot give us a total of 201
x x
snapshots. Raissi,
FigureOut
3: TheofKdV
1.0 M.equation:
this (2018).
data-set, Deep weto
A solution Hidden
the KdVPhysics
x)
generate a smaller
equation Models:
(left panel) Deep
training
is compared
0.75 subset,
to scat-
Raissi, M., Perdikaris, P., & Karniadakis, G. E. (2018). Multisteptered inidentified
Figure 2: Lorenz System: The exact phase portrait of the Lorenz system (left panel) is the Learning
corresponding
space system
and time,
0.5 of
solution Nonlinear
correctly by
of the learned
randomly
captures
Partial
partial Differential
di↵erential
the form ofsub-sampling
equationEquations.
Data (200 points)
(right
10000
the dynamics and accurately
panel).
0.50
0.25data
The
reproducespoints from
compared to the corresponding phase portrait of the learned dynamics (right panel).
Raissi, M. (2018). Multistep Neural Networks for Data-driven Discovery of Nonlinear
Dynamical Systems. arXiv preprint arXiv:1801.06637.

The Lorenz system has a positive Lyapunov exponent, and small differences between the
exact and learned models grow exponentially, even though the attractor remains intact.
This behavior is evident in figure 3, as we compare the exact versus the predicted
trajectories. Small discrepancies due to finite accuracy in the predicted dynamics
lead to large errors in the forecasted time-series after t > 4, despite the fact that
the bi-stable structure of the attractor is well captured (see figure 2).

[Figure: snapshots of the solution u(t, x) at t = 0.25, t = 0.50, and t = 0.75, with
the training data marked by white vertical lines.]

In other words, we are sub-sampling from the original dataset from time t = 0 to time
t = 26.8. The training data are represented by the white vertical lines. Given the
training data, we are interested in learning N as a function of the solution u and its
derivatives up to the 3rd order [6]; i.e.,

    u_t = N(u, u_x, u_xx, u_xxx).   (7)

We represent the solution u by a 5-layer deep neural network with 50 neurons per
hidden layer. Furthermore, we let N be a neural network with 2 hidden layers and 100
neurons per hidden layer. These two networks are trained by minimizing the sum of
squared errors loss of equation (3).

To illustrate the effectiveness of our approach, we solve the learned partial
differential equation (7) using the PINNs algorithm [34]. We assume periodic boundary
conditions and the same initial condition as the one used to generate the original
dataset. The resulting solution of the learned partial differential equation, as well
as the exact solution of the KdV equation, are depicted in figure 3. This figure
indicates that our algorithm is capable of accurately identifying the underlying
partial differential equation with a relative L2-error of 6.28e-02. It should be
highlighted that the training data are collected in roughly two-thirds of the domain
between times t = 0 and t = 26.8. The algorithm is thus extrapolating from time
t = 26.8 onwards. The corresponding relative L2-error on the training portion of the
domain is 3.78e-02.

[6] A detailed study of the choice of the order is provided in section 3.1 for the
Burgers' equation.

To test the algorithm even further, let us change the initial condition to
cos(-πx/20) and solve the KdV equation (6) using the conventional spectral method
outlined above. We compare the resulting solution to the one obtained by solving the
learned partial differential equation (5) using the PINNs algorithm [34]. It is worth
emphasizing that the algorithm is trained on the dataset depicted in figure 3 and is
being tested on a different dataset as shown in figure 4. The surprising result
reported in figure 4 strongly indicates that the algorithm is accurately learning the
underlying partial differential equation.

3.3. Fluid flow behind a cylinder

In this example we collect data for the fluid flow past a cylinder (see figure 4) at
Reynolds number 100 using direct numerical simulations of the two-dimensional
Navier-Stokes equations. In particular, following the problem setup presented in [23]
and [24], we simulate the Navier-Stokes equations describing the two-dimensional fluid
flow past a circular cylinder at Reynolds number 100 using the Immersed Boundary
Projection Method [25, 26]. This approach utilizes a multi-domain scheme with four
nested domains, each successive grid being twice as large as the previous one. Length
and time are non-dimensionalized so that the cylinder has unit diameter and the flow
has unit velocity. Data is collected on the finest domain with dimensions 9 × 4 at a
grid resolution of 449 × 199. The flow solver uses a 3rd-order Runge-Kutta

Table 2: Burgers' equation: Relative L2-error between solutions of the Burgers'
equation and the learned partial differential equation as a function of the highest
order of derivatives included.

                        1st order   2nd order   3rd order   4th order
    Relative L2-error   1.14e+00    1.29e-02    3.42e-02    5.54e-02

Figure 5: Burgers equation with noisy data: Top: Mean of p_θ(u|x, t, z), along with
the location of the training data {(x_i, t_i), u_i}, i = 1, ..., N_u. Middle:
Prediction and predictive uncertainty at t = 0.25, t = 0.5 and t = 0.75. Bottom:
Variance of p_θ(u|x, t, z).

High-dimensional PDEs

Raissi, M. (2018). Forward-backward stochastic neural networks: Deep learning of
high-dimensional partial differential equations. arXiv preprint arXiv:1804.07010.

Stochastic PDEs

Yang, Y., & Perdikaris, P. (2019). Adversarial uncertainty quantification in
physics-informed neural networks. Journal of Computational Physics.
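The excerpts above learn the right-hand side N in u_t = N(u, u_x, u_xx, u_xxx) with deep neural networks. As a deliberately simplified stand-in, the same residual construction can be illustrated by building the derivative features with finite differences and fitting a *linear* N by least squares on synthetic heat-equation data (the grid sizes and the linear model are illustrative assumptions, not the method of the paper):

```python
import numpy as np

# Synthetic data: u(t, x) = e^{-t} sin(x) + e^{-4t} sin(2x) solves u_t = u_xx.
x = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
t = np.linspace(0.0, 1.0, 101)
T, X = np.meshgrid(t, x, indexing="ij")
u = np.exp(-T) * np.sin(X) + np.exp(-4.0 * T) * np.sin(2.0 * X)

# Finite-difference stand-ins for the derivatives entering N(u, u_x, u_xx, u_xxx).
u_t = np.gradient(u, t, axis=0)
u_x = np.gradient(u, x, axis=1)
u_xx = np.gradient(u_x, x, axis=1)
u_xxx = np.gradient(u_xx, x, axis=1)

# Linear model for N: u_t ≈ c0*u + c1*u_x + c2*u_xx + c3*u_xxx.
library = np.stack([u, u_x, u_xx, u_xxx], axis=-1).reshape(-1, 4)
coeffs, *_ = np.linalg.lstsq(library, u_t.ravel(), rcond=None)
# coeffs should be close to [0, 0, 1, 0], i.e. the regression recovers u_t = u_xx.
```

In the papers quoted here the finite differences are replaced by automatic differentiation of a neural-network representation of u, and the linear model by a second network N, but the residual being minimized has exactly this structure.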
Recent advances
Fractional PDEs
Surrogate modeling & high-dimensional UQ
Zhu, Y., Zabaras, N., Koutsourelakis, P. S., & Perdikaris, P. (2019).
Physics-constrained deep learning for high-dimensional surrogate modeling and
uncertainty quantification without labeled data. Journal of Computational Physics,
394, 56-81.

[Figure panels: (a) GRF KLE512, test 1. (b) GRF KLE512, test 2.]

Pang, G., Lu, L., & Karniadakis, G. E. (2018). fPINNs: Fractional physics-informed
neural networks. arXiv preprint arXiv:1811.08967.

Figure 1: fPINNs for solving integral, differential, and integro-differential
equations. Here we choose specific integro-differential operators in the form of
time- and/or space-fractional derivatives. fPINNs can incorporate both
fractional-order and integer-order operators. In the PDE shown in the figure, f(·) is
a function of operators. The abbreviations "SM" and "AD" represent spectral methods
and automatic differentiation, respectively.
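Automatic differentiation only produces integer-order derivatives, which is why fPINNs discretize the fractional part of the operator numerically. One classical discretization (a sketch of the general idea, not necessarily the specific scheme used in the paper) is the Grünwald-Letnikov approximation of the Riemann-Liouville derivative:

```python
import math
import numpy as np

def gl_weights(alpha, n):
    """Grünwald-Letnikov weights w_k = (-1)^k * binom(alpha, k), computed with
    the standard recurrence w_k = w_{k-1} * (k - 1 - alpha) / k."""
    w = np.empty(n + 1)
    w[0] = 1.0
    for k in range(1, n + 1):
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    return w

def gl_derivative(f, x, alpha, h=1e-3):
    """Approximate the order-alpha fractional derivative of f at x
    (lower terminal 0) by the shifted-sum h^{-alpha} * sum_k w_k f(x - k h)."""
    n = int(round(x / h))
    w = gl_weights(alpha, n)
    nodes = x - h * np.arange(n + 1)
    return h ** (-alpha) * np.dot(w, f(nodes))

# Sanity checks: for f(x) = x, D^{1/2} f(1) = 1 / Gamma(3/2), and alpha = 1
# reduces the weights to [1, -1, 0, ...], i.e. a backward first difference.
approx = gl_derivative(lambda s: s, 1.0, 0.5)
exact = 1.0 / math.gamma(1.5)
```

For α = 1 the sum collapses to the ordinary backward difference, so the same code covers the integer-order operators mentioned in the caption.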
In this paper, we focus on the NN approaches due to the high expressive power of NNs
in function approximation [24, 25, 26, 27]. In particular, we concentrate on
physics-informed neural networks (PINNs) [28, 29, 30, 1], which belong to the second
aforementioned category. The recent applications of PINNs include (1) inferring the
velocity and pressure fields from the concentra-

Multi-fidelity modeling for stochastic systems
Integrated software

L. LU, X. MENG, Z. MAO, AND G. E. KARNIADAKIS

Differentiable programming for scientific computing

Reverse-mode automatic differentiation of ODE solutions

Sparse Identification on only the beta(t) term
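The sparse-identification slide refers to SINDy-style regression: the unknown term is fit as a sparse combination of candidate functions. A minimal sequentially-thresholded least-squares sketch on a synthetic problem (the library, threshold, and data below are illustrative assumptions, not the β(t) example from the slides):

```python
import numpy as np

def stlsq(Theta, y, threshold=0.1, n_iter=10):
    """Sequentially thresholded least squares: alternately solve a least-squares
    problem and zero out coefficients whose magnitude falls below `threshold`,
    which promotes a sparse set of active library terms."""
    xi, *_ = np.linalg.lstsq(Theta, y, rcond=None)
    for _ in range(n_iter):
        small = np.abs(xi) < threshold
        xi[small] = 0.0
        if (~small).any():
            xi[~small], *_ = np.linalg.lstsq(Theta[:, ~small], y, rcond=None)
    return xi

# Toy problem: the target depends sparsely on 2 of 5 candidate features.
rng = np.random.default_rng(0)
Theta = rng.normal(size=(200, 5))  # candidate library, one column per term
y = 2.0 * Theta[:, 1] - 3.0 * Theta[:, 3] + 0.01 * rng.normal(size=200)
xi = stlsq(Theta, y)  # should recover approximately [0, 2, 0, -3, 0]
```

In the universal-differential-equation workflow, `Theta` would contain candidate terms evaluated along the trained model's trajectory, and the recovered sparse coefficients name the governing term.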
[Slide graphic: "Replace Unknown Portion" annotations on the model terms; identified
coefficient 0.0011560597253354426]

The main technical difficulty in training continuous-depth networks is performing
reverse-mode differentiation (also known as backpropagation) through the ODE solver.
Differentiating through the operations of the forward pass is straightforward, but
incurs a high memory cost and introduces additional numerical error.

We treat the ODE solver as a black box, and compute gradients using the adjoint
sensitivity method (Pontryagin et al., 1962). This approach computes gradients by
solving a second, augmented ODE backwards in time, and is applicable to all ODE
solvers. It scales linearly with problem size, has low memory cost, and explicitly
controls numerical error.
Consider optimizing a scalar-valued loss function L(·), whose input is the result of
an ODE solver:

    L(z(t1)) = L( z(t0) + ∫_{t0}^{t1} f(z(t), t, θ) dt ) = L( ODESolve(z(t0), f, t0, t1, θ) )   (3)
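The adjoint recipe behind equation (3) can be sanity-checked on a scalar linear ODE, where dL/dθ is available in closed form. A sketch with SciPy (the scalar problem and all constants are illustrative assumptions; the augmented state stacks z, the adjoint a = ∂L/∂z, and the accumulated parameter gradient):

```python
import numpy as np
from scipy.integrate import solve_ivp

theta, z0, t0, t1 = 0.5, 2.0, 0.0, 1.0

# Forward pass: z(t1) = ODESolve(z(t0), f, t0, t1, theta) with f(z, t, theta) = theta*z.
fwd = solve_ivp(lambda t, s: [theta * s[0]], (t0, t1), [z0],
                rtol=1e-10, atol=1e-12)
z1 = fwd.y[0, -1]

def augmented(t, s):
    """Augmented dynamics for [z, a, g], solved backwards from t1 to t0."""
    z, a, g = s
    return [theta * z,   # dz/dt = f(z, t, theta): recompute z backwards in time
            -a * theta,  # da/dt = -a * df/dz: adjoint equation
            -a * z]      # dg/dt = -a * df/dtheta: accumulates dL/dtheta

# For L = z(t1): a(t1) = dL/dz(t1) = 1 and g(t1) = 0; integrate down to t0.
bwd = solve_ivp(augmented, (t1, t0), [z1, 1.0, 0.0], rtol=1e-10, atol=1e-12)
dL_dtheta = bwd.y[2, -1]

# Closed form: z(t1) = z0*exp(theta*(t1-t0)), so dL/dtheta = z0*(t1-t0)*exp(theta*(t1-t0)).
exact = z0 * (t1 - t0) * np.exp(theta * (t1 - t0))
```

Note that the augmented system re-integrates z itself backwards rather than storing the forward trajectory, which is exactly the memory-saving trick described in the text.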
Figure 2: Reverse-mode differentiation of an ODE solution. The adjoint sensitivity
method solves an augmented ODE backwards in time.

Solving for the adjoint requires knowing the value of z(t) along its entire
trajectory. However, we can simply recompute z(t) backwards in time together with the
adjoint.

Chen, R. T. Q., Rubanova, Y., Bettencourt, J., & Duvenaud, D. (2018). Neural Ordinary
Differential Equations. In Advances in Neural Information Processing Systems
(pp. 6571-6583).

Rackauckas, C., Ma, Y., Martensen, J., Warner, C., Zubov, K., Supekar, R., ... &
Ramadhan, A. (2020). Universal Differential Equations for Scientific Machine
Learning. arXiv preprint arXiv:2001.04385.

The results above extend those of Stapor et al. (2018, section 2.4.2). An extended
version of Algorithm 1 including derivatives w.r.t. t0 and t1 can be found in
Appendix C. Detailed derivations are provided in Appendix B. Appendix D provides
Python code which computes all derivatives for scipy.integrate.odeint by extending
the autograd automatic differentiation package. This