Applied Stochastic Processes
February 9, 2021
Contents
1 Introduction
1.1 Specification of Stochastic process
1.1.1 Process with independent increments
1.1.2 Second order Process
1.1.3 Gaussian process
1.1.4 Markov process
1.2 Covariance function
2 Martingales
2.1 Examples
2.2 Polya's Urn Scheme
2.3 Super and sub martingales
2.4 Polya's Theorem
2.4.1 1D simple symmetric random walk
2.4.2 2D simple symmetric random walk
2.4.3 3D simple symmetric random walk
2.5 Gambler's Ruin Problem
3 Markov Chains
3.1 Transition Probability Matrix
3.2 Initial Distribution
3.3 Two step Transition Probabilities
Chapter 1
Introduction
The theory of stochastic processes is generally regarded as the dynamic part of probability theory, in which one studies a collection of random variables indexed by a parameter. One is observing a stochastic process whenever one examines a system developing in time in a manner controlled by probabilistic laws. In other words, a stochastic process can be regarded as an empirical abstraction of a phenomenon developing in nature according to some probability laws.
If a scientist is to take into account the probabilistic nature of the phenomenon with which he is dealing, he should undoubtedly make use of the theory of stochastic processes. The scientist making measurements in his laboratory, the meteorologist attempting to forecast weather, the control system engineer designing a servomechanism, the electrical engineer designing a communication system, the hardware engineer developing a computer network, the economist studying price fluctuations and business cycles, the seismologist studying earthquake vibrations, and the neurosurgeon studying the electrocardiogram all encounter problems to which the theory of stochastic processes can be applied. Financial modelling and insurance mathematics are emerging areas where the theory of stochastic processes is widely used.
Examples of stochastic processes are provided by the generation sizes of a population such as a bacterial colony, the lifetimes of items under successive renewals, service times in a queuing system, waiting times in front of a service counter, the displacement of a particle executing Brownian motion, the number of events during a particular time interval, the number of deaths in a hospital on different days, the voltage in an electrical system at different time instants, the maximum temperature at a particular place on different days, the deviation of an artificial satellite from its stipulated path at each instant after its launch, the quantity of a particular inventory purchased on different days, etc. Suppose, for instance, that a scientist is observing the trajectory of a satellite after its launch. At random time intervals, the scientist observes whether it is deviating from the designated path and, if so, the magnitude of the deviation.
Definition 1.1. A stochastic process is a random process which depends upon the time factor. Families of random variables which are functions of time are known as stochastic processes, random processes, or random functions. Thus a stochastic process is a family of indexed random variables
{X(t, w) : t ∈ T, w ∈ Ω} defined on a probability space (Ω, S, P) and indexed by a parameter t, where T is called the index set or "parameter space", and the set of values that X(t, w) can assume is called the "state space".
A function defined on a sample space is a random variable. Typical examples of random variables are the number of aces in a hand of bridge and the number of shared birthdays in a company of n people; these are defined on discrete sample spaces. A gambling game also gives rise to random variables: every loss or gain of the gambler is a random variable corresponding to the respective game.
1. For each choice of t ∈ T, X(t, w) is a random variable.
2. For each choice of w ∈ Ω, X(t, w) is a function of t, called a sample path or realization of the process.
The state space S can be discrete or continuous. When S is discrete, by a proper labeling we can take the state space to be (a subset of) the set of natural numbers N; it may be finite or infinite. The main elements distinguishing stochastic processes are the nature of the state space S, the nature of the parameter space T, and the dependence relations among the random variables X(t). Accordingly there are four types of processes.
1. Type-1: Both S and T are discrete. Examples are provided by the number of customers reported at a bank counter on the nth day, the nth generation size of a population, the number of births in a hospital on the nth day, etc.
2. Type-2: T is continuous and S is discrete. Examples are the number of persons in a queue at time t, the number of telephone calls during (0, t), the number of vehicles passing through a specific junction during (0, t), etc.
3. Type-3: T is discrete and S is continuous. Examples are provided by the life length of the nth renewed item, the service time of the nth customer, the waiting time on the nth day to get transport, the maximum temperature in a city on the nth day, etc.
4. Type-4: Both T and S are continuous. Examples are the voltage in an electrical system at time t, the ECG level of a patient at time t, the speed of a vehicle at time t, the displacement of a particle undergoing Brownian motion at time t, the altitude of a satellite at time t, etc.
Sometimes the discrete parameter family is called a stochastic sequence and the continuous parameter family a stochastic process. We now describe some of the classical types of stochastic processes, characterized by different dependence relations among the X(t).
1.1.1 Process with independent increments
Let t0 < t1 < t2 < ... be points of T and write Z(ti) = X(ti) − X(ti−1). If the increments Z(t1), Z(t2), ... are independent random variables, then the stochastic process {X(t) : t ∈ T} is called a process with independent increments.
e.g. X(t): the number of customers arriving at a retail counter in time (0, t).
Then the number of customers arriving between times t0 and t1 is X(t1) − X(t0) = Z(t1).
1.2 Covariance function
For a second order process, define the mean function M(t) = E{X(t)} and
C(s, t) = Cov{X(s), X(t)} = E[X(s)X(t)] − E{X(s)}E{X(t)}.
The function C(s, t) is known as the covariance function of the stochastic process {X(t) : t ∈ T}.
A stochastic process {X(t) : t ∈ T} is called a covariance stationary process, or simply a stationary process, if its mean function M(t) is independent of t and its covariance function C(s, t) is a function of the time difference t − s only. Let us consider a particle whose displacement at time t is given by
X(t) = A1 (t) cos ωt + A2 (t) sin ωt
where A1(t) and A2(t) are independent random variables with mean 0 and common variance σ², and ω is the angular frequency. Here
M(t) = E{X(t)}
= E{A1(t) cos ωt + A2(t) sin ωt}
= E{A1(t)} cos ωt + E{A2(t)} sin ωt = 0

C(s, t) = E[X(s)X(t)] − E{X(s)}E{X(t)}
= E[X(s)X(t)]
= E[{A1(s) cos ωs + A2(s) sin ωs}{A1(t) cos ωt + A2(t) sin ωt}]
= E[A1(s)A1(t)] cos ωs cos ωt + E[A1(s)A2(t)] cos ωs sin ωt
  + E[A2(s)A1(t)] sin ωs cos ωt + E[A2(s)A2(t)] sin ωs sin ωt
= σ²[cos ωs cos ωt + sin ωs sin ωt]
= σ² cos ω(t − s)
= ρ(t − s)
So the stochastic process {X(t) : t ∈ T } is a covariance stationary process.
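As a quick numerical check of this computation, the following sketch estimates C(s, t) by Monte Carlo and compares it with σ² cos ω(t − s). The Gaussian draws and the particular values of σ, ω, s, t are hypothetical choices, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(0)
omega, sigma, trials = 2.0, 1.5, 200_000   # hypothetical parameter values

# Draw A1, A2 independently with mean 0 and variance sigma^2.
A1 = rng.normal(0.0, sigma, trials)
A2 = rng.normal(0.0, sigma, trials)

def X(t):
    """One realization of the process per sampled (A1, A2) pair."""
    return A1 * np.cos(omega * t) + A2 * np.sin(omega * t)

s, t = 0.7, 1.9
emp_cov = np.mean(X(s) * X(t))               # M(t) = 0, so C(s,t) = E[X(s)X(t)]
theory = sigma**2 * np.cos(omega * (t - s))
print(emp_cov, theory)                       # the two values should nearly agree
```

Rerunning with other (s, t) pairs having the same difference t − s gives the same covariance, as stationarity requires.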
Past (u < s)   Present (s)   Future (t > s)
The future depends only on the present, not on the past. So a Markov process can be defined as follows: a stochastic process {X(t) : t ∈ T} is a Markov process if, for any s < t,
Pr{X(t) ≤ x | X(s) = xs and X(u) = xu for u < s} = Pr{X(t) ≤ x | X(s) = xs}.
Example-1: Let {Xn : n ≥ 1} be a sequence of uncorrelated random variables with mean 0 and unit variance. Then
C(n, m) = Cov(Xn, Xm) = E(Xn Xm) = 0 if m ≠ n, and 1 if m = n.
Since C(n, m) depends only on the difference m − n, the stochastic process {Xn : n ≥ 1} is covariance stationary. But it is not strictly stationary; it will be strictly stationary when the Xn's are identically distributed.
Example-2: Let {X(t) : t ≥ 0} be a Poisson process with rate λ, so that
Pr(X(t) = n) = exp(−λt)(λt)^n / n! ;  n = 0, 1, 2, ...
Here M(t) = E{X(t)} = λt, which is not independent of t, so the Poisson process is not a stationary process.
Example-3: Let X(t) = A1 + A2 t, where A1 and A2 are independent random variables with
E(Ai) = ai and V(Ai) = σi² ; i = 1, 2.
Find the mean and covariance function and verify that the process is evolutionary.
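A sketch of the requested verification, assuming X(t) = A1 + A2 t with Var(Ai) = σi² (the printed variance value appears garbled, so this is a reconstruction):

```latex
M(t) = E\{X(t)\} = a_1 + a_2 t, \qquad
C(s,t) = \mathrm{Cov}(A_1 + A_2 s,\; A_1 + A_2 t) = \sigma_1^2 + \sigma_2^2\, st .
```

Since M(t) depends on t (and C(s, t) is not a function of t − s alone), the process is evolutionary, i.e. non-stationary.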
Properties
1. The covariance function is symmetric in t and s, i.e. C(s, t) = C(t, s) ∀ t, s ∈ T.
2. By the Schwarz inequality, |C(s, t)| ≤ √(C(s, s) C(t, t)).
3. The covariance function is non-negative definite, i.e.
Σ_{j=1}^{n} Σ_{k=1}^{n} aj ak C(tj, tk) = Var( Σ_{j=1}^{n} aj X(tj) ) ≥ 0,
where aj, ak ∈ R.
4. Closure property: the covariance functions also satisfy a closure axiom, because the sum and the product of two covariance functions are again covariance functions.
Remark. A stochastic process {X(t) : t ∈ T} is called stationary of order 2 if, for any t0 ∈ T, the joint distribution of (X(s + t0), X(t + t0)) is the same as the joint distribution of (X(s), X(t)).
Generalizing, the stochastic process is said to be stationary of order n if the joint distribution of X(t1), X(t2), ..., X(tn) is the same as the joint distribution of X(t1 + h), X(t2 + h), ..., X(tn + h) for arbitrary values of t1, t2, ..., tn and any h > 0. If this holds for every n, the process is called strictly stationary; covariance (weak, wide sense) stationarity requires only that M(t) be constant and that C(s, t) depend on t − s alone.
Chapter 2
Martingales
A stochastic process is often characterized by the dependence relationships between the members of the family. A process with a particular type of dependence through the conditional mean is said to have the martingale property.
The word martingale is a French word which refers to a class of betting strategies in gambling that were popular during the 18th century in France. At present, martingale theory is an important part of probability theory, used to model many different real-life situations analytically.
Definition 2.1. A discrete parameter stochastic process {Xn : n ≥ 0} is called a martingale or a martingale process if ∀n we have
1. E{|Xn|} < ∞
2. E{Xn+1 | Xn, Xn−1, ..., X0} = Xn
Example: Let {Zi, i = 1, 2, ...} be a sequence of i.i.d. random variables with mean 0 (and E|Zi| < ∞), and let Xn = Σ_{i=1}^{n} Zi. Then {Xn : n ≥ 1} is a martingale. Here
E{|Xn|} = E{|Σ_{i=1}^{n} Zi|} ≤ Σ_{i=1}^{n} E|Zi| < ∞,
and since Xn+1 = Xn + Zn+1 with Zn+1 independent of X0, X1, ..., Xn,
E{Xn+1 | Xn, ..., X0} = Xn + E(Zn+1) = Xn + 0 = Xn.
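Taking expectations in the martingale property shows that E[Xn] = E[X0] for every n. The sketch below checks this for the partial-sum martingale, assuming (as a concrete choice not made in the text) that the Zi are fair ±1 coin flips:

```python
import random

random.seed(1)
paths, n_steps = 100_000, 20

# X_n = Z_1 + ... + Z_n with Z_i i.i.d. fair +/-1 flips (mean 0).
totals = [0.0] * (n_steps + 1)
for _ in range(paths):
    x = 0
    for n in range(1, n_steps + 1):
        x += random.choice((-1, 1))
        totals[n] += x

means = [tot / paths for tot in totals]
# For a martingale, E[X_n] = E[X_0] = 0 at every n.
print(max(abs(m) for m in means))
```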
2.1 Examples
Example-1
Gambler’s Ruin Problem: Suppose that a gambler with initial capital Rs. a plays a series of games against a rich adversary (say a gambling machine). In each game he stands to gain one unit of money with probability p and to lose one unit with probability q, where p + q = 1.
Let Xn be the gain of the player in the first n games and Zi the loss/gain of the player in the ith game, so that Xn = Σ_{i=1}^{n} Zi. When p = q = 1/2, E(Zi) = 0 and, by the previous example, {Xn : n ≥ 1} is a martingale. Here we say that the gambler is ruined after the nth game if Xn = −a.
Example-2
Gambler’s ruin problem: Let the gambler play a series of games with p = q = 1/2 and let Yn be the gambler’s fortune after the nth game. Then {Yn : n ≥ 1} is also a martingale.
Define Xn = Yn² − n. Given Yn, Yn−1, ..., Y0, we have
Pr{Xn+1 = (Yn + 1)² − (n + 1)} = 1/2 and Pr{Xn+1 = (Yn − 1)² − (n + 1)} = 1/2,
so that
E{Xn+1 | Yn, ..., Y0} = (1/2)[(Yn + 1)² + (Yn − 1)²] − (n + 1) = Yn² + 1 − (n + 1) = Yn² − n = Xn,
so {Xn : n ≥ 1} is a martingale.
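The conditional-mean computation can be checked mechanically: for every fortune y and time n, averaging the two equally likely values of Y_{n+1}² − (n + 1) must return y² − n. A small sketch:

```python
# With p = q = 1/2, given Y_n = y the next value of X = Y^2 - n is
# (y+1)^2 - (n+1) or (y-1)^2 - (n+1), each with probability 1/2.
for y in range(-4, 5):
    for n in range(6):
        avg = 0.5 * ((y + 1) ** 2 - (n + 1)) + 0.5 * ((y - 1) ** 2 - (n + 1))
        assert avg == y ** 2 - n   # conditional mean equals X_n
print("E[X_{n+1} | Y_n = y] = y^2 - n verified")
```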
Example-3
Consider the case p ≠ q. Define Vn = (q/p)^(Yn), where Yn is the fortune of the gambler after the nth game.
Given Yn, Yn−1, ..., Y0, we have Vn+1 = (q/p)^(Yn+1), with
Pr{Vn+1 = (q/p)^(Yn + 1)} = p  and  Pr{Vn+1 = (q/p)^(Yn − 1)} = q.
Now
E{Vn+1 | Yn, ..., Y0} = p (q/p)^(Yn + 1) + q (q/p)^(Yn − 1) = (q/p)^(Yn) (q + p) = (q/p)^(Yn) = Vn,
so {Vn : n ≥ 1} is a martingale.
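The same one-step identity can be verified numerically for Vn = (q/p)^(Yn); the values p = 0.6, q = 0.4 below are a hypothetical choice with p ≠ q:

```python
p, q = 0.6, 0.4          # any p != q with p + q = 1 works
r = q / p
for y in range(-5, 6):
    # Given Y_n = y: V_{n+1} is r**(y+1) w.p. p, or r**(y-1) w.p. q.
    one_step_mean = p * r ** (y + 1) + q * r ** (y - 1)
    assert abs(one_step_mean - r ** y) < 1e-12   # equals V_n = r**y
print("de Moivre martingale property verified")
```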
Example-4
Let {Zn : n ≥ 1} be a sequence of i.i.d. random variables with EZn = 1 ∀n, and define Xn = Π_{k=1}^{n} Zk with X0 = 1. Verify whether {Xn : n ≥ 0} is a martingale.
ANS: EZn = 1 ∀n (the Zn’s are i.i.d.) and Xn = Π_{k=1}^{n} Zk, so by independence E{|Xn|} = Π_{k=1}^{n} E|Zk| < ∞ (assuming E|Zk| < ∞). Since Xn+1 = Xn Zn+1 with Zn+1 independent of X0, ..., Xn,
E{Xn+1 | Xn, ..., X0} = Xn E(Zn+1) = Xn,
so {Xn : n ≥ 0} is a martingale.
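A simulation sketch of this product martingale, with hypothetical factors Zk ∈ {0.5, 1.5} taken with probability 1/2 each (so EZk = 1), checks that E[Xn] stays at X0 = 1:

```python
import random

random.seed(2)
paths, n = 200_000, 15
total = 0.0
for _ in range(paths):
    x = 1.0
    for _ in range(n):
        x *= random.choice((0.5, 1.5))   # i.i.d. factors with mean 1
    total += x
print(total / paths)   # should be close to E[X_n] = E[X_0] = 1
```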
2.4.1 1D simple symmetric random walk
By Stirling’s approximation, n! ≈ √(2π) e^(−n) n^(n+1/2). Hence, with p = q = 1/2,
P00^(2n) = [(2n)! / (n! n!)] p^n q^n
≈ [√(2π) e^(−2n) (2n)^(2n+1/2) / (2π e^(−2n) n^(2n+1))] p^n q^n
= (1/√(2πn)) 2^(2n+1/2) p^n q^n
= (1/√(2πn)) 2^(1/2)
= 1/√(πn).
Therefore
Σ_n P00^(2n) = (1/√π) Σ_n 1/√n
is a divergent series. Therefore the state 0 is a persistent (recurrent) state; the particle returns to its initial state with probability 1.
2.4.2 2D simple symmetric random walk
P00^(2n) = Σ_{i+j=n} [(2n)! / (i! i! j! j!)] (1/4)^(2n)
= Σ_{i=0}^{n} (2n)! 4^(−2n) / [(i!)² ((n − i)!)²]
= Σ_{i=0}^{n} (2n)! (n!)² 4^(−2n) / [(n!)² (i!)² ((n − i)!)²]
= 2nCn 4^(−2n) Σ_{i=0}^{n} nCi · nC(n−i)
= 2nCn 4^(−2n) 2nCn   [since Σ_{i=0}^{n} nCi nC(n−i) = 2nCn]
= [2nCn 4^(−n)]²
≈ [1/√(πn)]²   [by the 1D estimate 2nCn (1/4)^n ≈ 1/√(πn)]
= 1/(πn).
Therefore
Σ_n P00^(2n) = (1/π) Σ_n 1/n
is a divergent series. So state 0 is persistent.
2.4.3 3D simple symmetric random walk
Suppose a particle moves in 3 dimensions: forward and backward, left and right, up and down, each of the six moves with probability 1/6. A return to the initial position requires an even number of steps, say 2n, with the particle taking i steps in the positive and i steps in the negative direction along the X-axis, and similarly j forward and j backward steps along the Y-axis and k forward and k backward steps along the Z-axis, where i + j + k = n.
P00^(2n) = Σ_{i+j+k=n} [(2n)! / (i! j! k!)²] (1/6)^(2n)
= Σ_{i+j+k=n} (2n)! 6^(−2n) (n!)² / [(i! j! k!)² (n!)²]
= 2nCn 2^(−2n) Σ_{i+j+k=n} [3^(−n) n! / (i! j! k!)]²
≈ (1/√(πn)) Σ_{i+j+k=n} [3^(−n) n! / (i! j! k!)]².
The numbers 3^(−n) n!/(i! j! k!) are trinomial probabilities, so they sum to 1, and each is at most the central term (i = j = k ≈ n/3), which by Stirling’s formula is of order 1/n. Hence the sum of their squares is O(1/n), and
P00^(2n) = O(n^(−3/2)).
Consequently Σ_n P00^(2n) converges, so state 0 is transient: in three dimensions the particle returns to its initial position with probability less than 1. Together with the 1D and 2D cases, this is Pólya’s theorem.
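The 2D/3D dichotomy can be seen numerically: partial sums of the exact P00^(2n) keep growing in 2D but level off in 3D. A sketch using the exact formulas above (the cutoff of 40 terms is an arbitrary choice):

```python
from math import comb, factorial

def p00_2d(n):
    # 2D: P00^(2n) = [C(2n, n) * 4^{-n}]^2
    return (comb(2 * n, n) * 0.25 ** n) ** 2

def p00_3d(n):
    # 3D: P00^(2n) = C(2n, n) * 2^{-2n} * sum over i+j+k=n of
    #                [3^{-n} * n!/(i! j! k!)]^2
    s = 0.0
    for i in range(n + 1):
        for j in range(n - i + 1):
            k = n - i - j
            trinom = factorial(n) / (factorial(i) * factorial(j) * factorial(k))
            s += (trinom / 3 ** n) ** 2
    return comb(2 * n, n) * 0.25 ** n * s

S2 = sum(p00_2d(n) for n in range(1, 40))
S3 = sum(p00_3d(n) for n in range(1, 40))
print(S2, S3)   # S2 keeps growing with more terms; S3 is already near its limit
```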
Chapter 3
Markov Chains
A discrete parameter Markov process is called a Markov chain. For a Markov chain {Xn : n ≥ 0} we have
Pr{Xn+1 = j | Xn = i, Xn−1 = i1, Xn−2 = i2, ..., X0 = in}
= Pr{Xn+1 = j | Xn = i}
= Pij ;  i, j ∈ S
Here Pij denotes the probability that, given the system is in state i at time n, it goes to state j at the next time (n + 1). These are also known as the one-step transition probabilities.
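On a finite state space the one-step probabilities Pij are collected in a transition probability matrix, each row of which must sum to 1. A sketch with a hypothetical 3-state chain (the entries are illustrative, not from the text):

```python
import numpy as np

# Hypothetical 3-state chain; P[i, j] = Pr{X_{n+1} = j | X_n = i}.
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.5, 0.3]])

print(P.sum(axis=1))   # each row is a probability distribution over next states
```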
3.3 Two step Transition Probabilities
Define P_ij^(2) = Pr{Xn+2 = j | Xn = i}. So, conditioning on the state r occupied at time n + 1,
P_ij^(2) = Σ_r Pr(Xn+2 = j | Xn+1 = r) · Pr(Xn+1 = r | Xn = i) = Σ_r P_ir · P_rj
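In matrix form, the two-step probabilities are exactly the entries of P², since the sum over r is a matrix multiplication. A sketch with a hypothetical 3-state matrix:

```python
import numpy as np

# Hypothetical one-step transition matrix.
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.5, 0.3]])

P2 = P @ P   # P_ij^(2) = sum_r P_ir * P_rj

# Check one entry against the defining sum.
direct = sum(P[0, r] * P[r, 2] for r in range(3))
print(P2[0, 2], direct)
```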
Chapman-Kolmogorov Equation
Statement:
The n-step transition probability is defined by
P_ij^(n) = Pr{Xm+n = j | Xm = i},
which is the probability that the system, being in state i at time m, reaches state j after n transitions, i.e. the system is in state j at time (m + n).
The (m + n)-step transition probabilities must satisfy the following equation:
P_ij^(m+n) = Σ_r P_ir^(m) · P_rj^(n)
Proof: Conditioning on the state r occupied at the intermediate time m, we have
P_ij^(m+n) = Pr{Xm+n = j | X0 = i}
= Σ_r Pr{Xm+n = j | Xm = r, X0 = i} · Pr{Xm = r | X0 = i}
= Σ_r P_rj^(n) · P_ir^(m),
where the Markov property allows the conditioning on X0 = i to be dropped in the first factor.
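In matrix form the Chapman-Kolmogorov equation says that the (m + n)-step matrix is the product of the m-step and n-step matrices, i.e. powers of the one-step matrix multiply. A numerical sketch with a hypothetical 3-state chain:

```python
import numpy as np

# Hypothetical one-step transition matrix.
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.5, 0.3]])

m, n = 3, 4
lhs = np.linalg.matrix_power(P, m + n)                         # (m+n)-step matrix
rhs = np.linalg.matrix_power(P, m) @ np.linalg.matrix_power(P, n)
print(np.allclose(lhs, rhs))   # Chapman-Kolmogorov in matrix form
```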
Chapter 4