
Physics-Informed Neural Networks (PINNs) in Finance
Miquel Noguer i Alonso
Julián Antolín Camarena

Artificial Intelligence Finance Institute


October 2023

Abstract
Physics-Informed Neural Networks (PINNs) provide a framework to embed financial rules and model dynamics, such as those of the Heston model, directly into the learning process of neural networks. This ensures that predictions are not only data-consistent but also obey the underlying stochastic differential equations and known financial structures. We test the architecture with the Black-Scholes and Heston models as parametric models; it appears to learn both correctly.

1 Introduction
Finance is a field that often relies on differential equations, particularly partial and stochastic differential equations (PDEs and SDEs, respectively), to model phenomena such as option pricing. Physics-Informed Neural Networks (PINNs) provide a promising methodology for solving these PDEs by embedding the physics directly into the architecture and training process. The architecture was introduced by [Raissi et al., 2019].

2 Physics-Informed Neural Networks


2.1 Theoretical considerations
Physics-Informed Neural Networks (PINNs) are a type of neural network that
incorporates physical knowledge, often in the form of differential equations, into
the learning process. This approach allows the neural network to be guided by known physical laws, which can improve the accuracy and generalization of the
model, especially when training data is sparse.
The main idea behind PINNs is to use the neural network to approximate a
function that satisfies a given differential equation. During training, in addition
to minimizing the error between the network’s predictions and the available
data, the network also minimizes the error between its predictions and the
known physical laws represented by the differential equations. The main objec-
tive of this paper is to show that PINNs can be used in mathematical finance
to solve stochastic differential equations (SDEs) by using the Heston model as
a test bed.
A Physics-Informed Neural Network (PINN) is defined by a neural network
architecture and a loss function. The loss function consists of two terms: a data
mismatch term and a physics-informed regularization term:

L = Ldata + Lphysics (1)


where Ldata measures the mismatch between the neural network predictions and observed data, and Lphysics ensures that the predictions satisfy the Black-Scholes PDE.
Given a neural network fθ (S, t) with parameters θ, our prediction for the
option price is V (S, t) = fθ (S, t). The physics-informed regularization term can
be computed as:
$$\mathcal{L}_{\text{physics}} = \left( \frac{\partial f_\theta}{\partial t} + \frac{1}{2}\sigma^2 S^2 \frac{\partial^2 f_\theta}{\partial S^2} + r S \frac{\partial f_\theta}{\partial S} - r f_\theta \right)^{2} \tag{2}$$
Training involves minimizing the total loss L with respect to the network
parameters θ.

2.2 Bayesian Interpretation of PINNs


The PINN architecture has a Bayesian interpretation. The loss L can be interpreted as the negative logarithm of a posterior distribution over fθ, where the data mismatch term comes from the likelihood function and the physics-informed loss encodes the prior information available on fθ. Thus, in a Bayesian framework, for data X, we have:
$$\underbrace{p(f_\theta \mid X)}_{\text{posterior}} = \frac{\overbrace{p(X \mid f_\theta)}^{\text{likelihood}}\, \overbrace{p(f_\theta \mid \mathcal{D})\, p(\mathcal{D})}^{\text{prior}}}{\underbrace{p(X)}_{\text{evidence}}}, \tag{3}$$

where the prior is a distribution over differential operators D known to describe the dynamics of the function fθ, i.e., that satisfy
$$\mathcal{D} f_\theta = 0, \qquad \mathcal{L}_{\text{physics}} = \left| \mathcal{D} f_\theta \right|^{2}. \tag{4}$$

For many dynamical problems, the differential operator in question is known in general form; it is the parameters that specify it that must be fit from data. In the case where we know the operator exactly, p(D) = δ(D) (a Dirac delta distribution), i.e., there is certainty about the specific operator.
In the more general case we do not precisely know the parameters λ that specify the operator (hyperparameters from the Bayesian perspective). This means that the operator itself becomes stochastic, since we now need a distribution over λ to describe our uncertainty about the dynamics of the system. For such cases, we may write
$$p(f_\theta, \mathcal{D}_\lambda, \lambda) = p(f_\theta \mid \mathcal{D}_\lambda, \lambda)\, p(\mathcal{D}_\lambda \mid \lambda)\, p(\lambda). \tag{5}$$
The advantage afforded by PINNs in this case is that operator or system parameters can be fit from data. PINNs thus allow the dynamics to be learned in a data-driven fashion.
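As an illustration of this point, the sketch below treats the Black-Scholes coefficients σ and r as trainable parameters, playing the role of the hyperparameters λ, so that minimizing the physics residual also fits the operator to data. This is an assumed setup for exposition; the class name, initial values, and architecture are not taken from the paper.

```python
import torch
import torch.nn as nn

class BlackScholesPINN(nn.Module):
    """PINN whose operator parameters (sigma, r) are learned jointly with the network weights."""
    def __init__(self, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )
        # Operator hyperparameters lambda, initialised at a rough guess.
        self.log_sigma = nn.Parameter(torch.tensor(-1.6))  # sigma = exp(-1.6) ~ 0.2
        self.r = nn.Parameter(torch.tensor(0.03))

    def forward(self, S, t):
        return self.net(torch.cat([S, t], dim=1))

    def residual(self, S, t):
        """Black-Scholes residual evaluated with the current (learned) sigma and r."""
        S = S.clone().requires_grad_(True)
        t = t.clone().requires_grad_(True)
        V = self.forward(S, t)
        ones = torch.ones_like(V)
        V_t = torch.autograd.grad(V, t, ones, create_graph=True)[0]
        V_S = torch.autograd.grad(V, S, ones, create_graph=True)[0]
        V_SS = torch.autograd.grad(V_S, S, torch.ones_like(V_S), create_graph=True)[0]
        sigma = self.log_sigma.exp()
        return V_t + 0.5 * sigma**2 * S**2 * V_SS + self.r * S * V_S - self.r * V
```

A Gaussian prior p(λ) on (log σ, r) would then correspond to adding a quadratic penalty on these parameters to the total loss.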
Lastly, note that in the Bayesian view the prior imposes a regularization constraint on estimation, since it penalizes parameters that stray far from what is already known about them, while the likelihood penalizes parameters that do not fit the data. In this sense, the well-known observation that imposing physical dynamics as a constraint acts as a regularizer for the neural network arises naturally: the term Lphysics penalizes functions that poorly satisfy the dynamics, while Ldata penalizes functions that poorly fit the data model. A regularized function is therefore forced to satisfy both constraints.
As a test bed for PINNs, we consider the Heston model, which is a stochastic
volatility model describing the joint evolution of an asset price and its volatility.
We aim to utilize PINNs to find solutions that both fit data and obey the model’s
dynamics.
Here are some key references on the topic: [Raissi et al., 2019], [Raissi and Karniadakis, 2018], [Raissi et al., 2017], and [Patrick Kidger and Lyons, 2021]. We will use two benchmark models in finance: Black-Scholes [Black and Scholes, 1973] and the Heston model [Heston, 1993].

3 Dynamical Systems in Finance


As noted in the introduction, quantitative descriptions of finance rely heavily
on PDEs and SDEs. These are natural choices as asset prices can depend on
several variables and change over time randomly.
We provide two well-known examples below.

3.1 The Black-Scholes Model


Consider the Black-Scholes equation, which is a PDE used to describe the price
of a European call option:

$$\frac{\partial V}{\partial t} + \frac{1}{2}\sigma^2 S^2 \frac{\partial^2 V}{\partial S^2} + r S \frac{\partial V}{\partial S} - r V = 0 \tag{6}$$
where V (S, t) is the option price, S is the stock price, σ is the volatility, and
r is the risk-free interest rate.
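For reference, Eq. (6) with the usual terminal and boundary conditions admits the standard closed-form price for a European call, which can also be used to generate data for the Ldata term. The snippet below is textbook material rather than anything specific to this paper; the parameter values are illustrative.

```python
from math import exp, log, sqrt
from statistics import NormalDist

def black_scholes_call(S, K, T, r, sigma):
    """European call price under Black-Scholes.
    S: spot, K: strike, T: time to maturity in years,
    r: risk-free rate, sigma: volatility."""
    N = NormalDist().cdf
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

# Example: at-the-money one-year call.
print(black_scholes_call(S=100.0, K=100.0, T=1.0, r=0.05, sigma=0.2))  # ~10.45
```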



Figure 1: Typical simulation trajectory of the Heston model. Top panel: simulation of the stock price St under instantaneous volatility νt. Bottom panel: instantaneous volatility νt.

3.2 The Heston Model


In the previous section, we described how PINNs can be used to solve a well-
known PDE in mathematical finance.
The main objective of this paper is to show that PINNs can be used in
mathematical finance to solve SDEs. We will utilize the celebrated Heston
model for asset volatility as a test bed.
The Heston model is given by:

dSt = rSt dt + vt St dWtS

dvt = κ(θ − vt )dt + ξ vt dWtv ,

where the Feller condition
$$2\kappa\theta > \xi^2 \tag{7}$$
is required for the variance process to remain positive. Figure 1 displays a typical trajectory of a stock price and its instantaneous volatility.
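A trajectory like the one in Figure 1 can be produced with a simple Euler-Maruyama scheme. The sketch below reuses the correlation and Heston parameters reported in Section 4.2 (ρ = 0.5, κ = 0.075, θ = 0.2, ξ = 0.1, drift 0.01) and assumes illustrative values for S0 and v0; it should be read as an assumption about the simulation, not the authors' exact code.

```python
import numpy as np

def simulate_heston(S0=100.0, v0=0.04, r=0.01, kappa=0.075, theta=0.2,
                    xi=0.1, rho=0.5, n_steps=252, dt=1.0 / 252, seed=0):
    """Euler-Maruyama simulation of the Heston SDEs with correlated
    Brownian increments; negative variance is truncated inside the square roots."""
    rng = np.random.default_rng(seed)
    S = np.empty(n_steps + 1)
    v = np.empty(n_steps + 1)
    S[0], v[0] = S0, v0
    for i in range(n_steps):
        z1, z2 = rng.standard_normal(2)
        dW_S = np.sqrt(dt) * z1
        dW_v = np.sqrt(dt) * (rho * z1 + np.sqrt(1.0 - rho**2) * z2)
        v_pos = max(v[i], 0.0)
        S[i + 1] = S[i] + r * S[i] * dt + np.sqrt(v_pos) * S[i] * dW_S
        v[i + 1] = v[i] + kappa * (theta - v[i]) * dt + xi * np.sqrt(v_pos) * dW_v
    return S, v

S_path, v_path = simulate_heston()  # one trading year of daily steps
```

Note that these parameters satisfy the Feller condition (7): 2κθ = 0.03 > ξ² = 0.01.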



Figure 2: Log-loss curves of training and validation losses.

4 The Physics-Informed Neural Network Approach


4.1 Setup
Let fθS (t) and fθv (t) be neural network representations for St and vt respectively.
The PINN loss function is:

L = Ldata + Lphysics
Where:

$$\mathcal{L}_{\text{physics}} = \left( \frac{d f_\theta^S}{dt} - r f_\theta^S - \sqrt{f_\theta^v}\, f_\theta^S \right)^{2} + \left( \frac{d f_\theta^v}{dt} - \kappa\big(\theta - f_\theta^v\big) - \xi \sqrt{f_\theta^v} \right)^{2}$$

4.2 Experiments
The PINN was trained on 10,000 trajectories of simulated asset price-volatility
pairs of length 252 days. This is to simulate one year of market data, assuming
252 trading days per year. The Wiener processes were simulated with correlated Gaussian noise with correlation coefficient ρ = 0.5, drift µ = 0.01, asset price standard deviation σS = 2.0, volatility standard deviation ξ = 0.1, long-term volatility mean θ = 0.2, and mean-reversion rate κ = 0.075. The experiment was repeated 100 times for statistically meaningful results.



Figure 3: Numerical results of the Heston PINN. Top panel: stock price over a trading year in blue, with the mean PINN prediction in black and confidence bands at one daily standard deviation. Bottom panel: volatility of the stock price over a trading year in blue, with the mean PINN prediction in black and confidence bands at one daily standard deviation.



Quantity        MAE       Ensemble average    Relative MAE
Stock price     $16.17    $101.9              0.16
Volatility      0.016     0.072               0.22

Table 1: Results over 10,000 trajectories of length 252 days each. Relative MAE is the MAE divided by the ensemble average.

4.3 Results
We find that the PINNs train the Heston model adequately, as evidenced in
Fig. 2. The estimated mean is largely within one daily standard deviation of
the ground truth, indicating that the Heston model is being learned relatively
well. This is largely due to the simple structure of the Heston model, but also
alludes to the power of the PINN approach.
Figure 3 shows a typical realization of the Heston price-volatility pair (top
and bottom panels, respectively). The ground truth is shown in blue, the PINN
estimate is shown in black, and the red dotted curves show the mean absolute
error (MAE) averaged over all trajectories at each time step. Table 1 shows
quantitatively these results.
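For reproducibility of Table 1, the relative MAE can be computed as the mean absolute error divided by the ensemble average of the ground truth, which matches the reported ratios (e.g. 16.17/101.9 ≈ 0.16). The helper below is a minimal sketch under that assumption; the array names and shapes are illustrative.

```python
import numpy as np

def mae_and_relative_mae(truth, pred):
    """truth, pred: arrays of shape (n_trajectories, n_steps).
    Returns the MAE and the MAE normalized by the mean level of the ground truth."""
    mae = np.mean(np.abs(truth - pred))
    return mae, mae / np.mean(np.abs(truth))

# Placeholder arrays standing in for the simulated and predicted paths.
rng = np.random.default_rng(0)
truth = 100.0 + rng.standard_normal((1000, 252))
pred = truth + rng.standard_normal((1000, 252))
mae, rel_mae = mae_and_relative_mae(truth, pred)
```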

5 Conclusion
Physics-Informed Neural Networks (PINNs) provide a framework to embed financial rules and model dynamics, such as those of the Heston model, directly into the learning process of neural networks. This ensures that predictions are not only data-consistent but also obey the underlying stochastic differential equations and known financial structures. We tested the architecture with the Black-Scholes and Heston models as parametric models, and it appears to learn both correctly.

References
[Black and Scholes, 1973] Black, F. and Scholes, M. (1973). The pricing of options and corporate liabilities. Journal of Political Economy, 81(3):637–654.

[Heston, 1993] Heston, S. L. (1993). A closed-form solution for options with stochastic volatility with applications to bond and currency options. The Review of Financial Studies, 6(2):327–343.
[Patrick Kidger and Lyons, 2021] Kidger, P., Foster, J., Li, X., Oberhauser, H., and Lyons, T. (2021). Neural SDEs as infinite-dimensional GANs. arXiv preprint arXiv:2102.03657.



[Raissi and Karniadakis, 2018] Raissi, M. and Karniadakis, G. E. (2018). Hidden physics models: Machine learning of nonlinear partial differential equations. Journal of Computational Physics, 357:125–141.

[Raissi et al., 2019] Raissi, M., Perdikaris, P., and Karniadakis, G. E. (2019). Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. Journal of Computational Physics, 378:686–707.

[Raissi et al., 2017] Raissi, M., Perdikaris, P., and Karniadakis, G. E. (2017). Physics informed deep learning (Part I): Data-driven solutions of nonlinear partial differential equations. arXiv preprint arXiv:1711.10561.
