Stationary Stochastic Processes: Maria Sandsten

1. The document provides summaries of important topics in stationary stochastic processes, including the definition of a stationary process, properties of covariance functions and spectral densities, filtering of stationary processes, and AR and MA processes.
2. It also provides examples of old exam questions, including identifying realizations, covariance functions, and spectral densities of AR and MA processes from figures, and computing the covariance function of a given discrete-time stationary process.
3. The key topics are the mathematical definitions and properties of stationary stochastic processes, AR and MA processes, and exam questions involving the analysis and computation of covariance functions and the identification of time series models.


Summary and old exam exercises

Stationary stochastic processes

Maria Sandsten

Lecture 11

October 14, 2019


Examination information


Important course topics


- Properties of a stationary stochastic process.
- Calculation of covariance functions from spectral densities.
- Characteristics of covariance function, spectral density and corresponding realization.
- Calculation of mean, covariance functions and spectral densities of filtered processes.
- Calculation of parameters, covariance functions and spectral densities of AR-processes and MA-processes (pole-zero plots).
- Cross-covariance and cross-spectrum calculations.
- Calculations of derivatives and integrals.
- Optimal filters (MSE, Wiener and matched filter).
- Sampling.
- Variance calculation of the mean value estimate.
- Knowledge of basic spectral estimation techniques.

A stationary stochastic process

A continuous time process, X(t), t ∈ R, or a discrete time process Xt, t = 0, ±1, ..., is weakly stationary if
- the expected value, E[X(t)] = m, does not depend on t, and
- the covariance function, C[X(s), X(t)] = r(t − s) = r(τ), depends only on the lag τ = t − s.

Then the variance is V[X(t)] = r(0) and the correlation function is ρ(τ) = r(τ)/r(0).

For a real-valued stationary stochastic process we have
- V[X(t)] = r(0) ≥ 0,
- r(−τ) = r(τ),
- |r(τ)| ≤ r(0).


Spectral density
For a weakly stationary process there exists a positive, symmetric and integrable spectral density function R(f) such that

   r(τ) = ∫_{−∞}^{∞} R(f) e^{i2πfτ} df,   R(f) = ∫_{−∞}^{∞} r(τ) e^{−i2πfτ} dτ,

in continuous time, and

   r(τ) = ∫_{−1/2}^{1/2} R(f) e^{i2πfτ} df,   R(f) = Σ_{τ=−∞}^{∞} r(τ) e^{−i2πfτ},

in discrete time. If E[X(t)] = 0, the power and the variance coincide:

   E[X(t)²] = V[X(t)] = r(0) = ∫_{−∞}^{∞} R(f) df.


Discrete time white noise

For discrete time white noise,

   R(f) = σ²,  −1/2 < f ≤ 1/2,

and

   r(τ) = σ²  for τ = 0,
   r(τ) = 0   for τ = ±1, ±2, ...
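As a quick numerical check (a NumPy sketch, not part of the original slides; the seed, sample size and σ² = 2 are arbitrary choices), simulated white noise should give r(0) ≈ σ² and r(τ) ≈ 0 for τ ≠ 0:

```python
import numpy as np

# Simulate discrete time white noise with variance sigma2 = 2
rng = np.random.default_rng(0)
sigma2 = 2.0
n = 200_000
e = rng.normal(0.0, np.sqrt(sigma2), n)

def sample_cov(x, tau):
    """Sample estimate of r(tau) = E[x_t x_{t+tau}] for a zero-mean sequence."""
    return np.mean(x[: len(x) - tau] * x[tau:])

r0 = sample_cov(e, 0)   # close to sigma2 = 2
r1 = sample_cov(e, 1)   # close to 0
r5 = sample_cov(e, 5)   # close to 0
```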


Random harmonic function


The random harmonic function

   X(t) = A cos(2πf0 t + φ),  t ∈ R,

is a stationary process with A > 0 and φ ∈ Rect(0, 2π). The covariance function is

   r(τ) = (E[A²]/2) cos(2πf0 τ),  τ ∈ R,

for all f0 > 0, and the spectral density is defined as

   R(f) = (E[A²]/4) (δ(f − f0) + δ(f + f0)).

The random harmonic function is also defined for discrete time, with 0 < f0 ≤ 0.5.
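The covariance formula can be checked by averaging over the phase directly (a sketch, not from the slides; A is fixed at 1 and the values of f0, t and τ are arbitrary):

```python
import numpy as np

# Average X(t) X(t + tau) over an evenly spaced grid of phases on [0, 2*pi);
# for a full-period grid this reproduces the expectation over phi exactly.
a, f0, t, tau = 1.0, 0.2, 0.3, 0.7        # A fixed at a = 1
phi = np.linspace(0.0, 2 * np.pi, 100_000, endpoint=False)

x_t = a * np.cos(2 * np.pi * f0 * t + phi)
x_tt = a * np.cos(2 * np.pi * f0 * (t + tau) + phi)

r_emp = np.mean(x_t * x_tt)                         # E[X(t) X(t + tau)]
r_theory = a**2 / 2 * np.cos(2 * np.pi * f0 * tau)  # (E[A^2]/2) cos(2 pi f0 tau)
```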


Filtering of stationary processes

For continuous time processes the output Y(t), t ∈ R, is obtained from the input X(t), t ∈ R, through

   Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du,

where h(t) is the impulse response. The corresponding frequency function is

   H(f) = ∫_{−∞}^{∞} h(t) e^{−i2πft} dt.

For discrete time processes, the integrals are changed to sums.


Filtering of stationary processes


The mean value is

   mY = mX ∫_{−∞}^{∞} h(u) du = mX H(0),

the covariance function is

   rY(τ) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(u) h(v) rX(τ + u − v) du dv,

and the spectral density is

   RY(f) = |H(f)|² RX(f).

For discrete time processes, the integrals are changed to sums.
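In discrete time the relation RY(f) = |H(f)|² RX(f) can be verified directly for white noise through a short FIR filter (a sketch; the filter taps are arbitrary, and for white-noise input the double sum for rY collapses to σ² Σ_u h(u) h(u + |τ|)):

```python
import numpy as np

# White noise (variance sigma2) through a short FIR filter h (arbitrary taps).
sigma2 = 1.0
h = np.array([1.0, 0.5, 0.25])   # impulse response h(0), h(1), h(2)

def r_y(tau):
    """r_Y(tau) = sigma2 * sum_u h(u) h(u + |tau|) for white-noise input."""
    tau = abs(tau)
    if tau >= len(h):
        return 0.0
    return sigma2 * float(np.sum(h[: len(h) - tau] * h[tau:]))

# R_Y(f) two ways: as |H(f)|^2 R_X(f), and as the DTFT of r_Y(tau)
f = 0.1
H = np.sum(h * np.exp(-2j * np.pi * f * np.arange(len(h))))
R_from_H = sigma2 * abs(H) ** 2
R_from_r = sum(r_y(t) * np.exp(-2j * np.pi * f * t) for t in range(-2, 3)).real
```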


The MA(q)-process

A moving average process of order q, MA(q), is given by

   Xt = c0 et + c1 et−1 + ... + cq et−q,

where et, t = 0, ±1, ±2, ..., is zero-mean white Gaussian noise with variance σ². The expected value is mX = 0, and the covariance function satisfies

   rX(τ) = C[Xt, Xt+τ] ≠ 0 only for |τ| ≤ q.

The spectral density is the discrete time Fourier transform of rX(τ), or

   RX(f) = σ² |c0 + c1 e^{−i2πf} + ... + cq e^{−i2πfq}|².
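The cut-off of the covariance at lag q can be made concrete: rX(τ) = σ² Σ_k ck ck+|τ| for |τ| ≤ q and 0 beyond. A sketch for a hypothetical MA(2) with coefficients c = (1, 0.4, −0.3):

```python
import numpy as np

# Hypothetical MA(2): X_t = e_t + 0.4 e_{t-1} - 0.3 e_{t-2}, sigma2 = 1
c = np.array([1.0, 0.4, -0.3])
sigma2 = 1.0
q = len(c) - 1

def r_ma(tau):
    """r_X(tau) = sigma2 * sum_k c_k c_{k+|tau|}; identically zero for |tau| > q."""
    tau = abs(tau)
    if tau > q:
        return 0.0
    return sigma2 * float(np.sum(c[: len(c) - tau] * c[tau:]))

r = [r_ma(t) for t in range(5)]   # r(3) and r(4) are exactly 0
```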


The AR(p)-process

An auto-regressive process of order p, AR(p), is given by

   Xt + a1 Xt−1 + a2 Xt−2 + ... + ap Xt−p = et,

where et, t = 0, ±1, ±2, ..., is zero-mean white Gaussian noise with variance σ². The expected value is mX = 0, and the covariance function solves the Yule-Walker equations

   rX(τ) + a1 rX(τ − 1) + ... + ap rX(τ − p) = σ²  for τ = 0,
   rX(τ) + a1 rX(τ − 1) + ... + ap rX(τ − p) = 0   for τ = 1, 2, ...

The spectral density is

   RX(f) = σ² / |1 + a1 e^{−i2πf} + ... + ap e^{−i2πfp}|².
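Using r(−τ) = r(τ), the equations for τ = 0, ..., p form a small linear system in r(0), ..., r(p). A sketch for the AR(2) process Xt − Xt−1 + 0.5Xt−2 = et with σ² = 1 (the same process as in exam 080306-5 later in this deck):

```python
import numpy as np

# Yule-Walker for X_t + a1 X_{t-1} + a2 X_{t-2} = e_t with a1 = -1, a2 = 0.5
a = np.array([1.0, -1.0, 0.5])     # [1, a1, a2]
sigma2 = 1.0
p = len(a) - 1

# Equations tau = 0..p in the unknowns r(0), ..., r(p), using r(-tau) = r(tau):
# sum_k a_k r(|tau - k|) = sigma2 if tau == 0 else 0
A = np.zeros((p + 1, p + 1))
for tau in range(p + 1):
    for k in range(p + 1):
        A[tau, abs(tau - k)] += a[k]
b = np.zeros(p + 1)
b[0] = sigma2

r = np.linalg.solve(A, b)          # r(0) = 2.4, r(1) = 1.6, r(2) = 0.4
r3 = -(a[1] * r[2] + a[2] * r[1])  # the recursion for tau > p gives r(3) = -0.4
```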


Old exam 180108-1


The following figures show realizations, covariance functions and spectral densities of one AR(p)-process and one MA(q)-process. Determine, with motivation, which figures are realizations, which are covariance functions and which are spectral densities. Also state which realization, covariance function and spectral density belong together. Decide which type each process is (AR or MA) and which order it has. The order can be assumed to be less than 10 for both processes. Important! To receive full credit, correct motivations have to be given for all answers.
(10p)


Old exam 080306-5

A weakly stationary process Xt, t = 0, ±1, ±2, ..., is defined as

   Xt − Xt−1 + 0.5Xt−2 = et + B,

where et is white noise with expected value zero and variance σ² = 1. B is a stochastic variable with variance σB², independent of et. Compute the covariance function rX(τ) for τ = 0, ±1, ±2, ±3.

(20p)


Solution
Define a new process Zt = Xt − cB, with c chosen so that Zt is zero-mean and accordingly an AR(2)-process, Zt − Zt−1 + 0.5Zt−2 = et. We substitute Xt = Zt + cB into Xt − Xt−1 + 0.5Xt−2 = et + B and take expectations:

   E[Zt + cB] − E[Zt−1 + cB] + 0.5 E[Zt−2 + cB] = E[et] + E[B].

As E[Zt] = 0 and E[et] = 0 by definition,

   cE[B] − cE[B] + 0.5cE[B] = E[B],

giving c = 2. The Yule-Walker equations for Zt are the same as in the AR(2)-example from Example 8.pdf among the lecture notes, giving, for σ² = 1, rZ(0) = 2.4, rZ(±1) = 1.6, rZ(±2) = 0.4 and rZ(±3) = −0.4. Accordingly, since Zt and B are independent,

   rX(τ) = C[Zt + 2B, Zt+τ + 2B] = rZ(τ) + 4σB².

(Also read Example 4.12, page 105.)


Cross-covariance, cross spectrum and coherence spectrum


The cross-covariance function is defined as

   rX,Y(τ) = C[X(t), Y(t + τ)].

Note that in general rX,Y(τ) ≠ rX,Y(−τ), but rX,Y(τ) = rY,X(−τ).

The complex-valued cross spectrum RX,Y(f) is defined so that

   rX,Y(τ) = ∫ RX,Y(f) e^{i2πfτ} df.

The quadratic coherence spectrum is defined as

   κ²X,Y(f) = |RX,Y(f)|² / (RX(f) RY(f)),  0 ≤ κ²X,Y(f) ≤ 1.

The same formulations apply for discrete time processes.
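The asymmetry rX,Y(τ) ≠ rX,Y(−τ) is easy to see numerically. In this sketch (a hypothetical example, with arbitrary seed and sample size) Y(t) = X(t − 1) for white noise X, so rX,Y(1) = rX(0) ≈ 1 while rX,Y(−1) = rX(2) ≈ 0:

```python
import numpy as np

# X is unit-variance white noise; Y_t = X_{t-1} (hypothetical example).
rng = np.random.default_rng(1)
n = 200_000
x = rng.normal(size=n)
y = np.concatenate(([0.0], x[:-1]))   # Y_t = X_{t-1}

def cross_cov(a, b, tau):
    """Sample estimate of C[a(t), b(t + tau)]; tau may be negative."""
    if tau >= 0:
        return np.mean(a[: len(a) - tau] * b[tau:])
    return np.mean(a[-tau:] * b[: len(b) + tau])

r_p1 = cross_cov(x, y, 1)     # C[X(t), Y(t+1)] = C[X(t), X(t)] = r_X(0), about 1
r_m1 = cross_cov(x, y, -1)    # C[X(t), Y(t-1)] = C[X(t), X(t-2)], about 0
check = cross_cov(y, x, -1)   # r_{Y,X}(-1); equals r_{X,Y}(1) by the identity
```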


Differentiation
A weakly stationary process X(t), t ∈ R, is said to be differentiable in quadratic mean, with derivative X′(t), if rX(τ) is twice differentiable, or if

   V[X′(t)] = ∫_{−∞}^{∞} (2πf)² RX(f) df < ∞.

We have mX′ = 0,

   rX′(τ) = −rX″(τ),

and

   RX′(f) = (2πf)² RX(f).

The cross-covariance function is rX,X′(t, t + τ) = r′X(τ) (the derivative of rX), and in particular rX,X′(t, t) = 0.
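The relation rX′(τ) = −rX″(τ) can be sanity-checked with a finite difference (a sketch, using the covariance rX(τ) = e^{−τ²} that appears in exam 181102-2 later in the deck; the evaluation point and step size are arbitrary):

```python
import numpy as np

# r_X(tau) = exp(-tau^2), as in exam 181102-2
def r_x(tau):
    return np.exp(-tau ** 2)

tau0, h = 0.5, 1e-4
# central difference approximation of the second derivative r_X''(tau0)
r2 = (r_x(tau0 + h) - 2 * r_x(tau0) + r_x(tau0 - h)) / h ** 2
r_deriv = -r2                                    # r_{X'}(tau0) = -r_X''(tau0)

# closed form: r_X''(tau) = (4 tau^2 - 2) exp(-tau^2)
exact = -(4 * tau0 ** 2 - 2) * np.exp(-tau0 ** 2)
```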


Old exam 171028-3

A zero-mean, weakly stationary, Gaussian process X(t), t ∈ R, has the spectral density

   RX(f) = π e^{−2π|f|}.

State, with a short motivation, for each of the following statements whether it is right or wrong. Each correct answer with motivation gives 2 credits.

a) The covariance function is rX(τ) = 1/(1 + τ²).
b) The process is differentiable in quadratic mean.
c) The variance V[(X(0) + X(t))/2] approaches zero when t → ∞.
d) After filtering through a linear filter with frequency function H(f) = 1 + if, the spectral density of the process is constant for all frequencies.
e) A new process Y(t) = 3X(t) − 2X′(t) is created. The process Y(t) is a Gaussian process.

(10p)


Old exam 181102-2

A stationary Gaussian process X(t), t ∈ R, has expected value E[X(t)] = 2 and covariance function

   rX(τ) = e^{−τ²}.

Compute the probability

   P(X′(t) ≥ X(t)).

(10p)


Optimal filters
- Mean square error optimal filter: minimize E[(Y(t) − S(t))²].
- The matched filter for binary detection is

     hopt(u) = s(T − u),

  for the zero-mean white noise disturbance case. For equal decision errors, the decision level is k = sout(T)/2, with errors

     α = β = 1 − Φ(sout(T)/(2σN)).

- The Wiener filter frequency function is

     Hopt(f) = RS(f) / (RS(f) + RN(f)),

  where RS(f) and RN(f) are the spectral densities of the zero-mean signal and the noise, respectively.
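The shape of the Wiener filter can be sketched for hypothetical spectra (a Lorentzian signal spectrum and white noise; neither is from the slides): the filter passes frequencies where the signal dominates and suppresses those where the noise dominates.

```python
import numpy as np

def R_S(f):
    """Hypothetical signal spectral density (Lorentzian)."""
    return 1.0 / (1.0 + (10.0 * np.asarray(f, dtype=float)) ** 2)

def R_N(f):
    """Hypothetical white noise spectral density."""
    return 0.5 * np.ones_like(np.asarray(f, dtype=float))

def H_opt(f):
    """Wiener filter: H_opt(f) = R_S(f) / (R_S(f) + R_N(f))."""
    return R_S(f) / (R_S(f) + R_N(f))

low = float(H_opt(0.0))    # 1 / (1 + 0.5) = 2/3: signal dominates at f = 0
high = float(H_opt(0.5))   # small: the signal is weak at f = 0.5
```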


Sampling
The continuous time process Y(t), t ∈ R, is sampled to the discrete time sequence Zt = Y(t), t = 0, ±d, ±2d, .... The covariance function is

   rZ(τ) = rY(τ),  τ = 0, ±d, ±2d, ...,

and the spectral density is

   RZ(f) = Σ_{k=−∞}^{∞} RY(f + k fs),  −fs/2 < f ≤ fs/2,

with fs = 1/d as the sampling frequency. With τ = nd, rZ(τ) is converted to rX(n), n = 0, ±1, ±2, ..., and

   RX(ν) = fs · RZ(νfs),

for ν = f · d = f/fs.
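The alias sum can be evaluated directly. A sketch using the band-pass spectral density from exam 171028-5 later in the deck (R = 1 for ||f| − 2.1| ≤ 0.1, else 0): with fs = 2 the bands fold down near f = 0, while with fs = 4 they fold to ±1.9.

```python
# Band-pass spectral density from exam 171028-5: R_Y(f) = 1 for ||f| - 2.1| <= 0.1
def R_Y(f):
    return 1.0 if abs(abs(f) - 2.1) <= 0.1 + 1e-9 else 0.0  # eps guards band edges

def R_Z(f, fs, K=10):
    """Folded spectrum R_Z(f) = sum_k R_Y(f + k*fs); R_Y is band-limited,
    so truncating the sum at |k| <= K = 10 is exact here."""
    return sum(R_Y(f + k * fs) for k in range(-K, K + 1))

rz_fs2 = R_Z(0.1, fs=2)       # = 1: the band at +2.1 folds down to f = 0.1
rz_fs4_low = R_Z(0.1, fs=4)   # = 0: nothing folds near f = 0 when fs = 4
rz_fs4_band = R_Z(1.9, fs=4)  # = 1: the band at -2.1 folds up to f = 1.9
```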


Estimation of mean
If X(t), t = 1, 2, ..., is weakly stationary with the unknown expected value m, then

   m̂n = (1/n) Σ_{t=1}^{n} X(t)

is an unbiased estimate of m, as E[m̂n] = m. The variance is

   V[m̂n] = (1/n²) Σ_{t=1}^{n} Σ_{s=1}^{n} C[X(t), X(s)] = (1/n²) Σ_{τ=−n+1}^{n−1} (n − |τ|) r(τ).

For large n,

   V[m̂n] ≈ (1/n) Σ_τ r(τ).
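The double-sum and single-sum forms of V[m̂n] agree exactly, and the large-n approximation is already close for moderate n. A sketch for a hypothetical covariance r(τ) = 0.5^|τ| (so that Σ_τ r(τ) = 3):

```python
def r(tau):
    return 0.5 ** abs(tau)   # hypothetical covariance, sum over all tau equals 3

n = 50

# V = (1/n^2) * sum_t sum_s C[X(t), X(s)], with C[X(t), X(s)] = r(t - s)
direct = sum(r(t - s) for t in range(n) for s in range(n)) / n**2

# V = (1/n^2) * sum_{tau = -n+1}^{n-1} (n - |tau|) r(tau)
single = sum((n - abs(tau)) * r(tau) for tau in range(-n + 1, n)) / n**2

approx = 3.0 / n             # large-n form: (1/n) * sum_tau r(tau)
```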


Old exam 171028-5


We define the weakly stationary Gaussian process Y (t) = m + X (t), t ∈ R, where m
is an unknown constant expected value. The process X (t) is assumed to have
E [X (t)] = 0 and spectral density

 1 for |f − 2.1| ≤ 0.1,
RX (f ) = 1 for |f + 2.1| ≤ 0.1
0 otherwise.

An estimate of m should be found as


Y (d) + Y (2d) + . . . Y (nd)
m
bn = ,
n
by sampling the process Y (t) with sample distance d. There are two possible choices
of sample distances, d = 1/4 and d = 1/2.
a) Calculate the covariance function rY (τ ).
(4p)
b) Determine the spectral density of the sampled process Zt = Y (t),
t = 0, ±d, ±2d, . . . for the two suggested sample distances, d = 1/4 and
d = 1/2.
(10p)
b n ], when n → ∞ for d = 1/4 and d = 1/2. Which of the
c) Determine nV [m
suggested sample distances would you prefer?
(6p)

Spectrum estimation
The periodogram is defined as

   R̂x(f) = (1/n) |Σ_t x(t) w(t) e^{−i2πft}|²,

where w(t) is a data window. With the Hanning window, the spectrum estimate will have better leakage properties (lower sidelobes), although the resolution is somewhat degraded (wider mainlobe), in comparison to the rectangular window.

The variance of the periodogram is

   V[R̂x(f)] ≈ RX²(f),  0 < |f| < 0.5.

Dividing the sequence into K possibly overlapping sub-sequences and calculating the average of K approximately uncorrelated estimates (Welch method) reduces the variance to

   V[R̂mv(f)] ≈ (1/K) RX²(f).
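The 1/K variance reduction can be seen in a small Monte Carlo sketch (rectangular window, non-overlapping segments, unit-variance white noise; the sizes and seed are arbitrary, and the two FFT bins are chosen to match the same frequency f = 0.125):

```python
import numpy as np

rng = np.random.default_rng(3)
n, K, n_trials = 256, 8, 2000
seg = n // K                      # 32 samples per segment
bin_full, bin_seg = 32, 4         # both FFT bins correspond to f = 0.125

per = np.empty(n_trials)          # plain periodogram at f = 0.125
welch = np.empty(n_trials)        # average of K segment periodograms
for i in range(n_trials):
    x = rng.normal(size=n)
    per[i] = np.abs(np.fft.fft(x)[bin_full]) ** 2 / n
    p_seg = np.abs(np.fft.fft(x.reshape(K, seg), axis=1)[:, bin_seg]) ** 2 / seg
    welch[i] = p_seg.mean()

ratio = per.var() / welch.var()   # close to K = 8
```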

Old exam 180108-3


[Figure: two panels, A and B, each showing 10 log10(E[R̂x(f)]) in dB versus frequency f ∈ [0, 0.4], with values between −20 and 30 dB.]

a) In the figures above, the expected values of the periodogram with the rectangular and the Hanning window, calculated from n samples, are illustrated with solid lines. Which of the figures shows the expected value for each of the two windows? Motivate your answer.
b) The sequence is divided into K sub-sequences and the final estimate is calculated as the average of K uncorrelated periodograms. How much does the variance decrease in comparison to the periodogram in a)?
(10p)
