PSD Periodogram

Roy Smith
Methods:
§ Sinusoidal correlation methods
§ Empirical Transfer Function Estimation (ETFE)
§ Spectral estimation
Inverse Fourier transform

x(k) = (1/2π) ∫_{-π}^{π} X(e^{jω}) e^{jωk} dω,   where k = -∞, ..., ∞.

Autocorrelation (finite energy signals)
[Figure: a finite-energy signal x(k) plotted against index k, k = 0, ..., 60.]
Autocorrelation example (finite energy signal)

R_x(τ) = Σ_{k=-∞}^{∞} x(k) x(k-τ),   τ = -∞, ..., 0, ..., ∞.

[Figure: the autocorrelation R_x(τ) of the signal above, plotted against lag τ.]
S_x(e^{jω}) = Σ_{τ=-∞}^{∞} R_x(τ) e^{-jωτ}

[Figure: the energy spectral density S_x(e^{jω}) plotted against ω (rad/sample), from -π to π.]
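As a concrete illustration of the two definitions above, the sketch below computes R_x(τ) by the finite sum and then evaluates S_x(e^{jω}) on a frequency grid. The decaying-sinusoid signal and the grid size are made-up choices for illustration, not the data used in the slides.

% Illustrative finite-energy signal (assumed example, not the slides' data)
K = 64;
k = (0:K-1)';
x = 2*exp(-k/20).*cos(0.3*pi*k);

% R_x(tau) = sum_k x(k) x(k-tau), over the signal's finite support
tau = -(K-1):(K-1);
Rx  = zeros(size(tau));
for i = 1:numel(tau)
    s = 0;
    for kk = max(0,tau(i)) : min(K-1, K-1+tau(i))
        s = s + x(kk+1) * x(kk-tau(i)+1);
    end
    Rx(i) = s;
end

% S_x(e^{jw}) = sum_tau R_x(tau) e^{-j w tau}, evaluated on a grid of w values
w  = linspace(-pi, pi, 512);
Sx = real(exp(-1j*w(:)*tau) * Rx(:));   % real because R_x is even

plot(w, Sx); xlabel('\omega (rad/sample)'); ylabel('S_x(e^{j\omega})');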
Discrete periodic signals

e^{jnω_0},   n = 0, 1, ..., M/2.

Autocorrelation (periodic signals, N = M)
[Figure: a periodic signal x(k) plotted against index k, showing periods at N and 2N.]
Autocorrelation example (periodic signal, N = M)

R_x(τ) = (1/N) Σ_{k=0}^{N-1} x(k) x(k-τ),   τ = -N/2+1, ..., N/2.

[Figure: the periodic autocorrelation R_x(τ) plotted against lag τ.]
φ_x(e^{jω_n}) = Σ_{τ=0}^{N-1} R_x(τ) e^{-jω_n τ},   ω_n = 2πn/N,   n = -N/2+1, ..., N/2.

[Figure: the power spectral density φ_x(e^{jω_n}) plotted against ω (rad/sample).]
Cross-correlation (periodic signals, N = M)
[Figure: a periodic signal y(k) plotted against index k.]
Cross-correlation example (periodic signal, N = M)

R_yu(τ) = (1/N) Σ_{k=0}^{N-1} y(k) u(k-τ),   τ = -N/2+1, ..., N/2.

[Figure: the cross-correlation R_yu(τ) plotted against lag τ.]
φ_yu(e^{jω_n}) = Σ_{τ=0}^{N-1} R_yu(τ) e^{-jω_n τ},   ω_n = 2πn/N,   n = -N/2+1, ..., N/2.

[Figure: the cross-spectral density magnitude |φ_yu(e^{jω_n})| plotted against ω (rad/sample).]
Noise models: random signals

[Block diagram: white noise e(k) drives H(e^{jω}) to produce v(k); the output is y(k) = G(e^{jω}) u(k) + v(k).]

v(k) = Σ_{l=0}^{∞} h(l) e(k-l) = H e(k),   with e(k) ∈ N(0, λ).
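A minimal sketch of this noise model in Matlab. The first-order filter below is an assumed example (not the H(e^{jω}) from the slides), and the biased sample autocovariance is included for comparison with R_v(τ).

% v(k) = H e(k) with e(k) zero-mean white Gaussian noise of variance lambda
lambda = 1;
N = 1024;
e = sqrt(lambda)*randn(N,1);        % e(k) ~ N(0, lambda)

bH = 1; aH = [1 -0.9];              % assumed example: H(z) = 1/(1 - 0.9 z^{-1})
v = filter(bH, aH, e);              % v(k) = sum_l h(l) e(k-l)

% biased sample autocovariance of v(k), for comparison with R_v(tau)
tauMax = 20;
Rv = zeros(tauMax+1,1);
for t = 0:tauMax
    Rv(t+1) = (1/N)*sum( v(t+1:N).*v(1:N-t) );
end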
R_x(τ) = E{x(k) x(k-τ)}
       = E{x(k) x*(k-τ)}   (in the complex case)
       = E{x(k) x*(k-τ)}   (in the multivariable case, with * the conjugate transpose)

Power spectral density (random signals)

φ_x(e^{jω}) = Σ_{τ=-∞}^{∞} R_x(τ) e^{-jωτ}
Basic properties

Autocovariance:
R_x(-τ) = R_x*(τ)
R_x(0) ≥ |R_x(τ)| for all τ > 0

Spectral density:
φ_x(e^{jω}) ∈ ℝ
φ_x(e^{jω}) ≥ 0 for all ω
φ_x(e^{jω}) = φ_x(e^{-jω}) for all real-valued x(k)
Cross-covariance (random signals)

R_yu(τ) = E{y(k) u(k-τ)}
Discrete Fourier Transform (finite-length signals)

x(k),   k = 0, ..., K-1.

X_K(e^{jω_n}) = Σ_{k=0}^{K-1} x(k) e^{-jω_n k},   ω_n = 2πn/K.
Periodogram

lim_{N→∞} (1/N) Σ_{τ=-N}^{N} |τ R_v(τ)| = 0
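The periodogram, (1/N)|X_N(e^{jω_n})|² with the unnormalised length-N DFT used on these slides, can be computed directly with an FFT. A minimal sketch, with v a made-up data record:

% Periodogram sketch: phi_hat(e^{jw_n}) = (1/N)|V_N(e^{jw_n})|^2
N = 1024;
v = filter(1, [1 -0.9], randn(N,1));   % assumed example data record

Vn   = fft(v);                         % length-N DFT, evaluated at w_n = 2*pi*n/N
phiV = (1/N)*abs(Vn).^2;               % periodogram

wn = 2*pi*(0:N-1)'/N;
semilogy(wn(1:N/2), phiV(1:N/2));      % plot over [0, pi)
xlabel('\omega_n (rad/sample)');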
Bibliography

Fourier transforms:
A.V. Oppenheim, A.S. Willsky & S.H. Nawab, Signals & Systems, Prentice-Hall, 2nd Ed., 1996.

Spectral estimation:
Lennart Ljung, System Identification: Theory for the User, Prentice-Hall, 2nd Ed., 1999 (see Section 6.4).
P. Stoica & R. Moses, Introduction to Spectral Analysis, Prentice-Hall, 1997 (see Chapters 1 and 2).

Periodograms:
Arthur Schuster, “The Periodogram of Magnetic Declination as obtained from the records of the Greenwich Observatory during the years 1871–1895,” Trans. Cambridge Phil. Soc., vol. 18, pp. 107–135, 1900.
Lennart Ljung, System Identification: Theory for the User, Prentice-Hall, 2nd Ed., 1999 (see Section 2.2).
Open-loop identification configuration

[Block diagram: e(k) drives H(e^{jω}) to produce v(k); y(k) = G(e^{jω}) u(k) + v(k).]

Model assumptions:
[Block diagram: e(k) drives H(e^{jω}) to produce v(k); y(k) = G(e^{jω}) u(k) + v(k).]

Identification problem:

Given data,
{u(k), y(k)},   k = 0, ..., K-1,
which we assume to have come from a noisy experiment with a plant G(e^{jω}), find an estimate
Ĝ(e^{jω}) ≈ G(e^{jω}).
Identification error: bias and variance

These are common measures of model quality, but we cannot calculate them exactly in practice.
[Block diagram: e(k) drives H(e^{jω}) to produce v(k); y(k) = G(e^{jω}) u(k) + v(k).]

y(k) = Σ_{l=0}^{∞} g(l) u(k-l) + v(k)

Idealized case:

Y(e^{jω}) / U(e^{jω}) = G(e^{jω}) + V(e^{jω}) / U(e^{jω}) ≈ G(e^{jω})
Empirical transfer function estimation (ETFE)

Motivation:

Time domain:       y(k) = Σ_{l=0}^{∞} g(l) u(k-l) + v(k)

Frequency domain:  Y(e^{jω}) / U(e^{jω}) = G(e^{jω}) + V(e^{jω}) / U(e^{jω})
Approximation:

Y_N(e^{jω_n}) = Σ_{k=0}^{N-1} y(k) e^{-jω_n k} ≈ Σ_{k=-∞}^{∞} y(k) e^{-jω_n k} = Y(e^{jω_n})   (length-N DFT)

U_N(e^{jω_n}) = Σ_{k=0}^{N-1} u(k) e^{-jω_n k} ≈ Σ_{k=-∞}^{∞} u(k) e^{-jω_n k} = U(e^{jω_n})   (length-N DFT)

ETFE:   Ĝ_N(e^{jω_n}) := Y_N(e^{jω_n}) / U_N(e^{jω_n})
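A minimal Matlab sketch of the ETFE. The second-order plant, the noise level, and the random input are made-up choices for illustration; with measured data, u and y would simply be the recorded input and output.

% ETFE: Ghat_N(e^{jw_n}) = Y_N(e^{jw_n}) / U_N(e^{jw_n})
N = 256;
u = randn(N,1);                            % input record
y = filter([0 0.2], [1 -1.5 0.7], u) ...   % assumed example plant G
    + 0.1*randn(N,1);                      % plus output noise v(k)

Un = fft(u);                               % length-N DFTs
Yn = fft(y);
Ghat = Yn./Un;                             % ETFE at w_n = 2*pi*n/N

wn = 2*pi*(0:N-1)'/N;
loglog(wn(2:N/2), abs(Ghat(2:N/2)));       % magnitude over (0, pi)
xlabel('\omega_n (rad/sample)'); ylabel('|Ghat_N(e^{j\omega_n})|');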
ETFE example: “experimental” data

[Figure: input u(k) and output y(k) data records plotted against index k, k = 0, ..., 128.]
[Figure: DFT magnitudes |U_N(e^{jω_n})| and |Y_N(e^{jω_n})| plotted against ω (rad/sample), from 0 to π.]
ETFE example

[Figure: magnitude and phase (deg.) of the ETFE Ĝ_N(e^{jω_n}) plotted against log ω (rad/sample).]
ETFE example

[Figure: the ETFE Ĝ_N(e^{jω_n}) compared with the true plant G(e^{jω}); magnitude and phase (deg.) against log ω (rad/sample).]
ETFE example: absolute error

[Figure: magnitude of the absolute error E_N(e^{jω_n}) compared with |G(e^{jω})|, plotted against log ω (rad/sample).]

Empirical transfer function estimation
Bias properties

Ĝ_N(e^{jω_n}) = Y_N(e^{jω_n}) / U_N(e^{jω_n}) = G(e^{jω_n}) + V_N(e^{jω_n}) / U_N(e^{jω_n})

We find the bias by examining

E{Ĝ_N(e^{jω_n})} = G(e^{jω_n}) + E{ V_N(e^{jω_n}) / U_N(e^{jω_n}) }
                 = G(e^{jω_n})   (assumes zero-mean noise)
ETFE error properties: variance

where |c| ≤ C = Σ_{τ=1}^{∞} |τ R_v(τ)| is assumed to be finite.
[Figure: magnitude and phase (deg.) Bode plot of the model P, plotted against log ω (rad/sec).]
ETFE example — taking more data

[Figure: magnitude of the ETFE error E_N(e^{jω_n}) for N = 128, 256, 512, and 1024, compared with |G(e^{jω})|, plotted against log ω (rad/sample).]
ETFE example: mean-square error

[Figure: E{|E_N(e^{jω_n})|} for N = 128, 256, 512, and 1024, compared with |G(e^{jω})| and |H(e^{jω})|², plotted against log ω (rad/sample).]
Transient responses: r(k)
Transient responses

[Figure: u_periodic(k), y_periodic(k), u(k), y(k), and the transient r(k) = y_periodic(k) - y(k), plotted against index k from -32 to 64.]
Transient responses

Periodic signal with period M = 32.

r(k) for k = 0, ..., N-1, with N = mM.

[Figure: the transient r(k) plotted against index k for m = 1, 4, and 16.]
Transient responses

R_N(e^{jω_n}) for n = 0, ..., N-1, with N = mM.

[Figure: magnitude of R_N(e^{jω_n}) plotted against log ω (rad/sample) for m = 1, 4, and 16.]
[Figure: the input u(k) plotted against index k for m = 1, 4, and 16 periods.]
Transient responses: input excitation

[Figure: input DFT magnitude plotted against ω (rad/sample) for m = 1, 4, and 16 periods.]
As N = mM with m → ∞ (periodic input):   |U_N(e^{jω_n})| = m |U_M(e^{jω_n})|

As N → ∞ (random input):   E{|U_N(e^{jω_n})|} → √(N φ_u(e^{jω_n}))

So |R_N(e^{jω_n}) / U_N(e^{jω_n})| → 0, with rate 1/N for periodic inputs, or 1/√N for random inputs.
Transient response example

[Figure: magnitude of Ĝ_N(e^{jω_n}) for m = 1, 4, and 16, compared with |G(e^{jω})|, plotted against log ω (rad/sample).]
[Figure: magnitude of the error E_N(e^{jω_n}) for m = 1, 4, and 16, compared with |G(e^{jω})|, plotted against log ω (rad/sample).]
ETFE example: average error with noisy data

[Figure: E{|E_N(e^{jω_n})|} for N = 128, 256, 512, and 1024, compared with |G(e^{jω})| and |H(e^{jω})|², plotted against log ω (rad/sample).]
ETFE example: average error with periodic signals

[Figure: E{|E_N(e^{jω_n})|} for N = 128, 256, 512, and 1024 (periodic input), compared with |G(e^{jω})| and |H(e^{jω})|², plotted against log ω (rad/sample).]
[Block diagram: e(k) drives H(e^{jω}) to produce v(k); y(k) = G(e^{jω}) u(k) + v(k).]

If v(k) = 0 then,
If v(k) ≠ 0 then,
Spectral transformations (random signals)

[Block diagram: e(k) drives H(e^{jω}) to produce v(k); y(k) = G(e^{jω}) u(k) + v(k).]

Cross-correlation result:

φ_yu(e^{jω}) = G(e^{jω}) φ_u(e^{jω})   (if u(k) and v(k) are uncorrelated)

Ĝ(e^{jω}) = φ̂_yu(e^{jω}) / φ̂_u(e^{jω})
Ĝ(e^{jω_n}) = φ̂_yu(e^{jω_n}) / φ̂_u(e^{jω_n})
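A sketch of this spectral-ratio estimate, with φ̂_yu and φ̂_u formed by averaging cross- and auto-periodograms over R sub-records (a Bartlett/Welch-style choice; the slides leave the particular spectral estimator open). The simulated system, noise level and input are made-up for illustration.

% Spectral-ratio estimate: Ghat = phihat_yu / phihat_u
R = 8;  N = 128;  K = R*N;
u = randn(K,1);
y = filter([0 0.2], [1 -1.5 0.7], u) + 0.1*randn(K,1);   % assumed example system

phi_u  = zeros(N,1);
phi_yu = zeros(N,1);
for r = 1:R
    idx = (r-1)*N + (1:N)';
    Ur  = fft(u(idx));
    Yr  = fft(y(idx));
    phi_u  = phi_u  + (1/N)*abs(Ur).^2 / R;         % averaged auto-periodogram
    phi_yu = phi_yu + (1/N)*(Yr.*conj(Ur)) / R;     % averaged cross-periodogram
end
Ghat = phi_yu ./ phi_u;          % estimate of G(e^{jw_n}) at w_n = 2*pi*n/N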
Spectral estimation (via the periodogram)

lim_{N→∞} (1/N) Σ_{τ=-N}^{N} |τ R_v(τ)| = 0
Spectral estimate:   φ̂_v(e^{jω}) = Σ_{τ=-N+1}^{N-1} R̂_v(τ) e^{-jωτ}
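A sketch of this estimate: compute the biased sample autocovariance R̂_v(τ) and evaluate the sum over a frequency grid. The data record v is a made-up example. For the biased estimator, this sum reproduces the periodogram (1/N)|V_N(e^{jω})|² for any ω.

% Spectral estimate from the estimated autocovariance
N = 512;
v = filter(1, [1 -0.9], randn(N,1));           % assumed example data record

tau  = -(N-1):(N-1);
Rhat = zeros(size(tau));
for t = 0:N-1
    r = (1/N)*sum( v(t+1:N).*v(1:N-t) );       % biased estimate at lag t
    Rhat(N+t) = r;                             % tau = +t
    Rhat(N-t) = r;                             % tau = -t (real data: symmetric)
end

w      = linspace(-pi, pi, 512);
phihat = real( exp(-1j*w(:)*tau) * Rhat(:) );  % sum_tau Rhat(tau) e^{-j w tau}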
Spectral estimation (periodic signals)

R_x(τ) = (1/M) Σ_{k=0}^{M-1} x(k) x(k-τ)   (using a periodic calculation)

The power spectral density can be calculated exactly and is also equal to the periodogram:

φ_x(e^{jω_n}) = Σ_{τ=0}^{M-1} R_x(τ) e^{-jω_n τ} = (1/M) |X_M(e^{jω_n})|²
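A quick numerical check of this identity, using a made-up two-sinusoid signal of period M: the DFT of the periodic autocorrelation matches |X_M(e^{jω_n})|²/M to rounding error.

% Periodic signal: PSD from periodic autocorrelation vs periodogram
M = 32;
k = (0:M-1)';
x = sin(2*pi*3*k/M) + 0.5*cos(2*pi*7*k/M);      % assumed example signal

Rx = zeros(M,1);
for tau = 0:M-1
    Rx(tau+1) = (1/M)*sum( x .* x(mod(k - tau, M) + 1) );   % circular shift
end

n    = (0:M-1)';
phi1 = exp(-1j*2*pi*n*(0:M-1)/M) * Rx;          % sum_tau R_x(tau) e^{-j w_n tau}
phi2 = abs(fft(x)).^2 / M;                      % periodogram |X_M(e^{jw_n})|^2 / M
max(abs(phi1 - phi2))                           % equal up to rounding error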
Random x(k): the estimate is biased,

E{R̂_x(τ)} = ((N - |τ|)/N) R_x(τ),

but asymptotically unbiased (as N → ∞ with τ/N → 0).
Bibliography

Spectral estimation:
Lennart Ljung, System Identification: Theory for the User, Prentice-Hall, 2nd Ed., 1999 (see Section 6.4).
P. Stoica & R. Moses, Introduction to Spectral Analysis, Prentice-Hall, 1997 (see Chapters 1 and 2).
Averaging

Multiple estimates:

Ĝ_r(e^{jω_n}) = Y_r(e^{jω_n}) / U_r(e^{jω_n})   (dropping the N from the Y_N(e^{jω_n}) notation)

Averaging to improve the estimate:

Ĝ(e^{jω_n}) = Σ_{r=1}^{R} α_r Ĝ_r(e^{jω_n}),   with Σ_{r=1}^{R} α_r = 1.

How to choose α_r?
Averaging

This is minimized by

α_r(e^{jω_n}) = (1/σ_r²(e^{jω_n})) / Σ_{r=1}^{R} (1/σ_r²(e^{jω_n})).
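A minimal sketch of averaging record-wise ETFEs with these weights. Since the smoothing weights used later are (1/N)|U_N(e^{jξ})|²/φ_v(e^{jξ}), here 1/σ_r²(e^{jω_n}) is taken proportional to |U_r(e^{jω_n})|² and the common φ_v factor cancels in the normalisation; this choice, and the simulated data, are assumptions of the sketch.

% Weighted average of ETFEs from R sub-records
R = 3;  N = 24;  K = R*N;
u = randn(K,1);
y = filter([0 0.2], [1 -1.5 0.7], u) + 0.1*randn(K,1);   % assumed example system

Ghat_r = zeros(N,R);
w_r    = zeros(N,R);
for r = 1:R
    idx = (r-1)*N + (1:N)';
    Ur  = fft(u(idx));
    Yr  = fft(y(idx));
    Ghat_r(:,r) = Yr ./ Ur;                    % ETFE of record r
    w_r(:,r)    = abs(Ur).^2;                  % proportional to 1/sigma_r^2
end
alpha    = w_r ./ sum(w_r, 2);                 % weights sum to 1 at each frequency
Ghat_avg = sum(alpha .* Ghat_r, 2);            % weighted-average estimate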
Averaging: variance reduction

Best result: |U_r(e^{jω_n})| is the same for all r = 1, ..., R. This gives

variance(Ĝ(e^{jω_n})) = variance(Ĝ_r(e^{jω_n})) / R.

If the estimates are biased we will not get as much variance reduction.

(The method of splitting the data record and averaging the periodograms is attributed to Bartlett (1948, 1950).)
Averaging example (noisy data)

Splitting the data record: K = 72, R = 3, N = 24.

[Figure: u(k), y(k), and the three sub-records y_1(k), y_2(k), y_3(k), plotted against index k from -20 to 70.]
[Figure: magnitudes of Ĝ_1(e^{jω_n}), Ĝ_2(e^{jω_n}), Ĝ_3(e^{jω_n}) and the weighted average Ĝ_avg(e^{jω_n}), compared with |G(e^{jω})|, plotted against log ω (rad/sample).]
Averaging example (noisy data)

Estimates: Ĝ_r(e^{jω_n}), r = 1, 2, 3, and the weighted average Ĝ_avg(e^{jω_n}).

[Figure: error magnitudes E_1(e^{jω_n}), E_2(e^{jω_n}), E_3(e^{jω_n}) and E_avg(e^{jω_n}), plotted against log ω (rad/sample).]
Averaging with periodic signals

Estimates: Ĝ_r(e^{jω_n}) and Ĝ_avg(e^{jω_n}) = ( Ĝ_2(e^{jω_n}) + Ĝ_3(e^{jω_n}) ) / 2.

[Figure: magnitudes of Ĝ_2(e^{jω_n}), Ĝ_3(e^{jω_n}) and Ĝ_avg(e^{jω_n}), compared with |G(e^{jω})|, plotted against log ω (rad/sample).]

[Figure: error magnitudes E_2(e^{jω_n}), E_3(e^{jω_n}) and E_avg(e^{jω_n}) for the periodic case, with the non-periodic E_avg(e^{jω_n}) shown for comparison, plotted against log ω (rad/sample).]
G̃(e^{jω_n}) = Σ_{r=0}^{R-1} α_r Ĝ_r(e^{jω_n}) = Σ_{r=0}^{R-1} α_r Ŷ_r(e^{jω_n}) / Û_r(e^{jω_n}).

[Figure: error plotted against the number of records R, showing the MSE together with its bias-error and variance-error components.]
E{ (G(e^{jω_n}) - Ĝ_N(e^{jω_n})) (G(e^{jω_s}) - Ĝ_N(e^{jω_s})) } → 0   (n ≠ s),

(asymptotically at least).
Smoothing the ETFE
G̃_N(e^{jω_n}) = Σ_{r=-R}^{R} α_r Ĝ_N(e^{jω_{n+r}}) / Σ_{r=-R}^{R} α_r

              ≈ ∫_{ω_{n-r}}^{ω_{n+r}} α(e^{jξ}) Ĝ_N(e^{jξ}) dξ / ∫_{ω_{n-r}}^{ω_{n+r}} α(e^{jξ}) dξ,   with α(e^{jξ}) = (1/N) |U_N(e^{jξ})|² / φ_v(e^{jξ}).
Smoothing the ETFE

Smoothing window:

G̃_N(e^{jω_n}) = [ (1/2π) ∫_{-π}^{π} W_γ(e^{j(ξ-ω_n)}) α(e^{jξ}) Ĝ_N(e^{jξ}) dξ ] / [ (1/2π) ∫_{-π}^{π} W_γ(e^{j(ξ-ω_n)}) α(e^{jξ}) dξ ],

with α(e^{jξ}) = (1/N) |U_N(e^{jξ})|² / φ_v(e^{jξ}).

The 1/2π scaling will make it easier to derive time-domain windows later.
Assumptions on φ_v(e^{jω})

Then use

α(e^{jξ}) = (1/N) |U_N(e^{jξ})|² / φ_v(e^{jω_n}),

to get

G̃_N(e^{jω_n}) = [ (1/2π) ∫_{-π}^{π} W_γ(e^{j(ξ-ω_n)}) (1/N) |U_N(e^{jξ})|² Ĝ_N(e^{jξ}) dξ ] / [ (1/2π) ∫_{-π}^{π} W_γ(e^{j(ξ-ω_n)}) (1/N) |U_N(e^{jξ})|² dξ ].
Weighting functions

[Figure: frequency windows W_γ(e^{jω_n}) for γ = 1, 5, 10, and 20 (N = 256), plotted against discrete frequency ω from -π to π.]
Weighting functions

The differences are mostly in how much energy leaks to adjacent frequencies, and in the ability to resolve closely spaced frequency peaks.
Weighting functions

[Figure: Welch, Hann, Hamming, and Bartlett frequency windows (γ = 10, N = 256), plotted against discrete frequency ω from -π to π.]
ETFE smoothing example: Matlab calculations

(This fragment assumes that u and y are the length-N data records, Wg is a length-N vector of frequency-window values W_γ sampled at the DFT frequencies, and a is a length-N vector of weighting values α(e^{jω_n}).)

U = fft(u);                      % calculate N point FFTs
Y = fft(y);
Gest = Y./U;                     % ETFE estimate
Gs = 0*Gest;                     % smoothed estimate

for wn = 1:N,
    Wnorm = 0;                   % reset normalisation
    for xi = 1:N,
        widx = mod(xi-wn,N)+1;   % wrap window index
        Gs(wn) = Gs(wn) + Wg(widx) * Gest(xi) * a(xi);
        Wnorm = Wnorm + Wg(widx) * a(xi);
    end
    Gs(wn) = Gs(wn)/Wnorm;       % weight normalisation
end
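One possible way to construct the assumed Wg and a, taking a Bartlett lag window of (assumed) width γ = 10 for W_γ and approximating φ_v(e^{jω}) by a constant so that a reduces to |U_N|²/N up to a common scale factor:

gamma = 10;                         % assumed window-width parameter (gamma < N)
wlag  = zeros(N,1);                 % Bartlett lag window w(tau) = 1 - |tau|/gamma
wlag(1) = 1;                        % tau = 0
for t = 1:gamma-1
    wlag(1+t)   = 1 - t/gamma;      % positive lags
    wlag(N+1-t) = 1 - t/gamma;      % negative lags, wrapped circularly
end
Wg = real(fft(wlag));               % frequency window W_gamma at the N DFT frequencies

a  = abs(fft(u)).^2 / N;            % weighting alpha, with phi_v taken as constant

Other lag windows (Hann, Hamming, etc.) can be substituted in the same way by changing wlag.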
Window properties

(1/2π) ∫_{-π}^{π} W_γ(e^{jξ}) dξ = 1   (normalised)

∫_{-π}^{π} ξ W_γ(e^{jξ}) dξ = 0   (“even”, sort of)

M(γ) := ∫_{-π}^{π} ξ² W_γ(e^{jξ}) dξ   (bias effect)

W̄(γ) := 2π ∫_{-π}^{π} W_γ²(e^{jξ}) dξ   (variance effect)

Bartlett:   M(γ) = 2.78/γ,     W̄(γ) ≈ 0.67 γ   (for γ > 5)

Hamming:   M(γ) = π²/(2γ²),   W̄(γ) ≈ 0.75 γ   (for γ > 5)
Smoothed estimate properties

E{G̃(e^{jω_n})} - E{G(e^{jω_n})} = E{G̃(e^{jω_n})} - G(e^{jω_n})

  = M(γ) ( (1/2) G''(e^{jω_n}) + G'(e^{jω_n}) φ_u'(e^{jω_n}) / φ_u(e^{jω_n}) ) + O(C_3(γ)) + O(1/√N),

where O(C_3(γ)) → 0 as γ → ∞ and O(1/√N) → 0 as N → ∞.

Increasing γ:
§ makes the frequency window narrower;
§ averages over fewer frequency values;
§ makes M(γ) smaller; and
§ reduces the bias of the smoothed estimate, G̃(e^{jω_n}).
"´ ! )¯2 *
jωn jωn 1 φv pejωn q
E G̃pe q ´ E G̃pe q “ W̄ pγq
N φu pejωn q
` ˘
` W̄ pγq{N
oloooooomoooooon
ÝÑ 0
as γ ÝÑ 8
N ÝÑ 8
γ{N ÝÑ 0
Increasing γ:
§ makes the frequency window narrower;
§ averages over fewer frequency values;
§ makes W̄ pγq larger; and
§ increases the variance of the smoothed estimate, G̃pejωn q.
2018-10-10 4.25
Smoothed estimate properties

where F(e^{jω_n}) = (1/2) G''(e^{jω_n}) + G'(e^{jω_n}) φ_u'(e^{jω_n}) / φ_u(e^{jω_n})
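Because the bias scales with M(γ) F(e^{jω_n}) and the variance with W̄(γ) φ_v/(N φ_u), the quoted Hamming-window expressions give a quick numerical way to examine the trade-off and pick γ. The values of |F| and φ_v/φ_u below are made-up constants purely for illustration.

% Asymptotic bias/variance trade-off over gamma for a Hamming window,
% using M(gamma) = pi^2/(2*gamma^2) and Wbar(gamma) ~ 0.75*gamma from the slides.
N     = 1024;
absF  = 2;                           % assumed |F(e^{jw_n})| at the frequency of interest
noise = 0.1;                         % assumed phi_v/phi_u ratio

gamma = 5:200;
Mg    = pi^2 ./ (2*gamma.^2);        % bias-effect moment
Wbar  = 0.75 * gamma;                % variance-effect moment
bias2 = (Mg * absF).^2;              % squared-bias term
varr  = Wbar * noise / N;            % variance term
mse   = bias2 + varr;

[~, i] = min(mse);
gamma_opt = gamma(i)                 % gamma minimising the asymptotic MSE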
Bibliography

Lennart Ljung, System Identification: Theory for the User, Prentice-Hall, 2nd Ed., 1999 (see Section 6.4).

M.S. Bartlett, “Periodogram analysis and continuous spectra,” Biometrika, vol. 37, pp. 1–16, 1950.