Stationary Stochastic Processes
Maria Sandsten
Lecture 11
October 14, 2019
Summary and old exam exercises
Examination information
Spectral density
For a weakly stationary process there exists a positive, symmetric, and integrable spectral density function $R(f)$ such that
$$r(\tau) = \int_{-\infty}^{\infty} R(f)\, e^{i2\pi f\tau}\, df, \qquad R(f) = \int_{-\infty}^{\infty} r(\tau)\, e^{-i2\pi f\tau}\, d\tau.$$
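A standard example (not from the slide): for the covariance function $r(\tau) = \sigma^2 e^{-a|\tau|}$ with $a > 0$, the pair above gives
$$R(f) = \sigma^2 \int_{-\infty}^{\infty} e^{-a|\tau|}\, e^{-i2\pi f\tau}\, d\tau = \frac{2a\,\sigma^2}{a^2 + (2\pi f)^2},$$
which is indeed positive, symmetric in $f$, and integrable.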
White noise has constant spectral density $R(f) = \sigma^2$ and covariance function
$$r(\tau) = \begin{cases} \sigma^2, & \tau = 0,\\ 0, & \tau = \pm 1, \pm 2, \ldots \end{cases}$$
The random harmonic process has spectral density
$$R(f) = \frac{E[A^2]}{4}\,\big(\delta(f - f_0) + \delta(f + f_0)\big).$$
The random harmonic process is also defined for discrete time, with $0 < f_0 \le 0.5$.
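For reference, and assuming the standard definition used in the course, the random harmonic process is $X(t) = A\cos(2\pi f_0 t + \phi)$ with $\phi$ uniform on $[0, 2\pi]$ and independent of $A$. This gives $m_X = 0$ and $r(\tau) = \tfrac{E[A^2]}{2}\cos(2\pi f_0\tau)$; Fourier transforming this covariance yields the two spectral lines above.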
For a process $X$ filtered through a linear time-invariant filter with frequency function $H(f)$, the output $Y$ has spectral density
$$R_Y(f) = |H(f)|^2\, R_X(f).$$
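A minimal numerical sketch of this relation; the filter coefficients and simulation settings below are assumed examples, not from the slides:

```python
import numpy as np
from scipy import signal

# Check R_Y(f) = |H(f)|^2 * R_X(f): white noise (R_X(f) = sigma^2) is passed
# through an FIR filter and the estimated output spectrum is compared with theory.
rng = np.random.default_rng(0)
sigma2 = 1.0
n = 200_000
x = rng.normal(scale=np.sqrt(sigma2), size=n)      # white noise input

b = [1.0, -0.8, 0.3]                                # example FIR filter
y = signal.lfilter(b, [1.0], x)                     # filtered output

# Two-sided Welch estimate of the output spectral density
f, Ry_hat = signal.welch(y, fs=1.0, nperseg=1024, return_onesided=False)

# Theoretical spectral density |H(f)|^2 * sigma^2 on the same frequency grid
_, H = signal.freqz(b, [1.0], worN=f, fs=1.0)
Ry_theory = np.abs(H) ** 2 * sigma2

# Deviation is small compared with the spectrum level (estimation noise only)
print(np.max(np.abs(Ry_hat - Ry_theory)))
```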
The MA(q)-process
$$X_t = c_0 e_t + c_1 e_{t-1} + \ldots + c_q e_{t-q},$$
where $e_t$ is white noise with variance $\sigma^2$.
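Not shown on the slide, but it follows from the filtering relation $R_Y(f) = |H(f)|^2 R_X(f)$ above that the MA($q$) spectral density is
$$R_X(f) = \sigma^2\,\big| c_0 + c_1 e^{-i2\pi f} + \ldots + c_q e^{-i2\pi f q} \big|^2.$$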
The AR(p)-process
For $X_t + a_1 X_{t-1} + \ldots + a_p X_{t-p} = e_t$, with $e_t$ white noise of variance $\sigma^2$, the spectral density is
$$R_X(f) = \frac{\sigma^2}{\left| 1 + a_1 e^{-i2\pi f} + \ldots + a_p e^{-i2\pi f p} \right|^2}.$$
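A short sketch of how this spectral density can be evaluated numerically, here for the AR(2) polynomial $1 - z^{-1} + 0.5z^{-2}$ that also appears in the exam exercise further down (the choice of coefficients is otherwise an example):

```python
import numpy as np
from scipy import signal

# Evaluate R_X(f) = sigma^2 / |A(f)|^2, with
# A(f) = 1 + a_1 e^{-i2*pi*f} + ... + a_p e^{-i2*pi*f*p},
# via the frequency response of the all-pole filter 1/A.
sigma2 = 1.0
a = [1.0, -1.0, 0.5]                            # [1, a_1, a_2], AR(2) example

f = np.linspace(0.0, 0.5, 512)                  # normalized frequency, 0 <= f <= 0.5
_, H = signal.freqz([1.0], a, worN=f, fs=1.0)   # H(f) = 1 / A(f)
Rx = sigma2 * np.abs(H) ** 2                    # AR spectral density

print(f[np.argmax(Rx)])                         # frequency where the spectrum peaks
```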
Exam exercise: the process $X_t$ satisfies
$$X_t - X_{t-1} + 0.5\,X_{t-2} = e_t + B.$$
(20p)
Solution
Define a new process $Z_t = X_t - cB$ that is zero-mean and accordingly an AR(2)-process, $Z_t - Z_{t-1} + 0.5\,Z_{t-2} = e_t$. We substitute $X_t = Z_t + cB$ into $X_t - X_{t-1} + 0.5\,X_{t-2} = e_t + B$ and get
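$$(Z_t + cB) - (Z_{t-1} + cB) + 0.5\,(Z_{t-2} + cB) = e_t + B,$$
i.e. $Z_t - Z_{t-1} + 0.5\,Z_{t-2} + 0.5\,cB = e_t + B$, so $0.5\,cB = B$ gives $c = 2$ and hence the mean value $m_X = E[X_t] = 2B$.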
The cross-covariance function is
$$r_{X,Y}(\tau) = C[X(t),\, Y(t+\tau)],$$
and the squared coherence spectrum is
$$\kappa^2_{X,Y}(f) = \frac{|R_{X,Y}(f)|^2}{R_X(f)\,R_Y(f)}, \qquad 0 \le \kappa^2_{X,Y}(f) \le 1.$$
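A small sketch of estimating the coherence spectrum; the signal models are assumed examples (a linear relation plus independent noise, which pulls the coherence below 1):

```python
import numpy as np
from scipy import signal

# Y = filtered X + independent noise. A purely linear, noise-free relation
# would give coherence 1 at all frequencies; the added noise lowers it.
rng = np.random.default_rng(4)
n = 100_000
x = rng.normal(size=n)
y = signal.lfilter([1.0, 0.5], [1.0], x) + 0.5 * rng.normal(size=n)

f, coh = signal.coherence(x, y, fs=1.0, nperseg=1024)  # estimate of kappa^2_{X,Y}(f)
print(coh.min(), coh.max())                             # values stay within [0, 1]
```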
Differentiation
A weakly stationary process $X(t)$, $t \in \mathbb{R}$, is said to be differentiable in quadratic mean, with derivative $X'(t)$, if $r_X(\tau)$ is twice differentiable, or if
$$V[X'(t)] = \int_{-\infty}^{\infty} (2\pi f)^2 R_X(f)\, df < \infty.$$
We have $m_{X'} = 0$,
$$r_{X'}(\tau) = -r''_X(\tau),$$
and
$$R_{X'}(f) = (2\pi f)^2 R_X(f).$$
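An example not taken from the slides: for the process with covariance $r_X(\tau) = \sigma^2 e^{-a|\tau|}$ and spectral density $R_X(f) = 2a\sigma^2/(a^2 + (2\pi f)^2)$ (the earlier example), the integrand $(2\pi f)^2 R_X(f)$ tends to the constant $2a\sigma^2$ as $|f| \to \infty$, so
$$\int_{-\infty}^{\infty} (2\pi f)^2 \frac{2a\sigma^2}{a^2 + (2\pi f)^2}\, df = \infty,$$
and the process is not differentiable in quadratic mean; equivalently, $r_X(\tau)$ has a corner at $\tau = 0$ and is not twice differentiable there.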
Exam exercise: a zero-mean, weakly stationary, Gaussian process $X(t)$, $t \in \mathbb{R}$, has the spectral density
$$R_X(f) = \pi e^{-2\pi|f|}.$$
State, with a short motivation, for each of the following statements whether it is right or wrong. Each correct answer with motivation gives 2 credits.
a) The covariance function is $r_X(\tau) = \dfrac{1}{1+\tau^2}$.
b) The process is differentiable in quadratic mean.
c) The variance $V[(X(0) + X(t))/2]$ approaches zero when $t \to \infty$.
d) After filtering through a linear filter with frequency function $H(f) = 1 + if$, the spectral density of the process is constant for all frequencies.
e) A new process $Y(t) = 3X(t) - 2X'(t)$ is created. The process $Y(t)$ is a Gaussian process.
(10p)
Optimal filters
- Mean square error optimal filter: minimize $E[(Y(t) - S(t))^2]$.
- The matched filter for binary detection, for the zero-mean white noise disturbance case, maximizes the output signal-to-noise ratio at the decision time $T$. For equal decision errors, the decision level is $k = s_{out}(T)/2$, with error probabilities
$$\alpha = \beta = 1 - \Phi\!\left(\frac{s_{out}(T)}{2\sigma_N}\right).$$
- The Wiener filter frequency function is
$$H_{opt}(f) = \frac{R_S(f)}{R_S(f) + R_N(f)}$$
(a small numerical sketch follows below).
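A minimal sketch of the non-causal Wiener filter applied in the frequency domain; the signal and noise models below are assumed examples, not from the course material:

```python
import numpy as np
from scipy import signal

# H(f) = R_S(f) / (R_S(f) + R_N(f)) applied via FFT to a noisy observation
# Y = S + N, with an AR(1)-like signal S and independent white noise N.
rng = np.random.default_rng(1)
n = 4096

s = signal.lfilter([1.0], [1.0, -0.9], rng.normal(size=n))  # "signal" S
noise = rng.normal(scale=1.0, size=n)                       # white noise N
y = s + noise

f = np.fft.rfftfreq(n, d=1.0)
_, Hs = signal.freqz([1.0], [1.0, -0.9], worN=f, fs=1.0)
Rs = np.abs(Hs) ** 2          # spectral density of S (driving noise variance 1)
Rn = np.ones_like(Rs)         # white noise spectral density

H = Rs / (Rs + Rn)            # Wiener filter frequency function
s_hat = np.fft.irfft(H * np.fft.rfft(y), n=n)

print(np.mean((y - s) ** 2), np.mean((s_hat - s) ** 2))  # MSE before/after filtering
```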
Sampling
The continuous-time process $Y(t)$, $t \in \mathbb{R}$, is sampled to the discrete-time sequence $Z_t = Y(t)$, $t = 0, \pm d, \pm 2d, \ldots$. The covariance function is
$$r_Z(\tau) = r_Y(\tau), \qquad \tau = 0, \pm d, \pm 2d, \ldots$$
With $X_t$ denoting the samples on a unit-spaced time index and $f_s = 1/d$ the sampling frequency, the spectral density in the normalized frequency $\nu = f \cdot d = f/f_s$ is
$$R_X(\nu) = f_s \cdot R_Z(\nu f_s).$$
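A small numerical illustration of normalized frequency and aliasing under sampling; the sinusoid frequency and sampling interval below are assumed example values:

```python
import numpy as np

# A sinusoid at f0 = 0.4 Hz sampled with d = 2 s (fs = 0.5 Hz) is above the
# Nyquist frequency fs/2 = 0.25 Hz and folds to |fs - f0| = 0.1 Hz, i.e. to
# normalized frequency nu = 0.1 * d = 0.2 cycles per sample.
d = 2.0                               # sampling interval
t = np.arange(0, 400.0, d)            # sample times 0, d, 2d, ...
z = np.cos(2 * np.pi * 0.4 * t)

Z = np.abs(np.fft.rfft(z)) ** 2
nu = np.fft.rfftfreq(len(z), d=1.0)   # normalized frequency (cycles per sample)
print(nu[np.argmax(Z)])               # approximately 0.2
```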
Estimation of mean
If $X(t)$, $t = 1, 2, \ldots$, is weakly stationary with the unknown expected value $m$, then
$$\hat m_n = \frac{1}{n} \sum_{t=1}^{n} X(t).$$
For large $n$,
$$V[\hat m_n] \approx \frac{1}{n} \sum_{\tau} r(\tau).$$
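A small simulation check of the variance approximation; the AR(1) model and all parameter values below are assumed examples:

```python
import numpy as np
from scipy import signal

# For X_t = a*X_{t-1} + e_t with white noise e_t of variance sigma^2,
# sum_tau r(tau) = sigma^2 / (1 - a)^2, so V[m_hat] ~ sigma^2 / (n * (1 - a)^2).
rng = np.random.default_rng(2)
a, sigma2, n, n_rep = 0.8, 1.0, 2000, 5000

means = np.empty(n_rep)
for k in range(n_rep):
    e = rng.normal(scale=np.sqrt(sigma2), size=n + 500)
    x = signal.lfilter([1.0], [1.0, -a], e)[500:]   # discard the start-up transient
    means[k] = x.mean()

print(np.var(means))                  # empirical variance of the mean estimate
print(sigma2 / ((1 - a) ** 2 * n))    # approximation (1/n) * sum_tau r(tau)
```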
Spectrum estimation
The periodogram is defined as
$$\hat R_x(f) = \frac{1}{n} \left| \sum_{t} x(t)\, w(t)\, e^{-i2\pi f t} \right|^2,$$
where $w(t)$ is a data window. With the Hanning window, the spectrum estimate will have better leakage properties (lower sidelobes), although the resolution is somewhat degraded (wider mainlobe), in comparison to the rectangular window.
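A short sketch comparing the two windows under the definition above; the test signal (a single sinusoid plus weak noise) is an assumed example:

```python
import numpy as np

# Windowed periodogram R_hat_x(f) = (1/n) |sum_t x(t) w(t) e^{-i 2 pi f t}|^2
# for a rectangular and a Hanning data window.
rng = np.random.default_rng(3)
n = 256
t = np.arange(n)
x = np.cos(2 * np.pi * 0.1 * t) + 1e-3 * rng.normal(size=n)

nfft = 4096                                   # zero-padded frequency grid
f = np.fft.rfftfreq(nfft, d=1.0)

def wper(x, w):
    """Windowed periodogram, (1/n) * |FFT of x*w|^2."""
    return np.abs(np.fft.rfft(x * w, n=nfft)) ** 2 / len(x)

R_rect = wper(x, np.ones(n))                  # rectangular window
R_hann = wper(x, np.hanning(n))               # Hanning window

far = f > 0.3                                 # frequencies far from the peak at 0.1
print("leakage floor, rect [dB]:", 10 * np.log10(R_rect[far].max()))
print("leakage floor, hann [dB]:", 10 * np.log10(R_hann[far].max()))
# The Hanning estimate shows a much lower leakage floor (lower sidelobes) but a
# wider mainlobe around f = 0.1 (poorer resolution).
```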
[Two figures: $10\log_{10}$ of the expected value of the windowed periodogram, $E[\hat R_x(f)]$ in dB, plotted against frequency $f$ from 0 to 0.4; one panel per window.]
a) In the figures above, the expected values of the periodogram with the rectangular and the Hanning window, calculated from $n$ samples, are illustrated with solid lines. Which of the figures shows the expected value for which window? Motivate your answer.
b) The sequence is divided into $K$ subsequences and the final estimate is calculated as the average of $K$ uncorrelated periodograms. How much does the variance decrease in comparison to the periodogram in a)?
(10p)