
TURBULENCE

Lecture Note
2009 Fall

Parallel Computing Lab.
Hanyang Univ.
http://vortex.hanyang.ac.kr


Part II. Statistical Tools for Turbulent Flow

• A random variable u is defined by:
  a function which assigns a value u(ξ) to every outcome ξ of the experiment.
  Each realization of u must have the same probability of producing a given outcome → the samples are identically distributed.

• An ensemble average (true average) is defined by:

  $\overline{u} = \lim_{N\to\infty} \frac{1}{N}\sum_{i=1}^{N} u_i$

  But in reality we never have an infinite number of $u_i$'s → we can never compute the ensemble average exactly.
  We can, however, define an estimator for the average based on a finite number of $u_i$'s:

  $u_N = \frac{1}{N}\sum_{i=1}^{N} u_i$   ← itself a random variable

  Questions : ① Is this estimator unbiased? (Does it converge to the correct answer?)
              ② Does it converge at all?
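A minimal numerical sketch of the finite-N estimator above (assuming NumPy; the random variable and sample sizes are illustrative, not from the notes):

import numpy as np

rng = np.random.default_rng(0)

def ensemble_estimate(samples):
    """Finite-N estimator u_N = (1/N) * sum(u_i); itself a random variable."""
    return samples.mean()

# Illustrative r.v.: normal with true ensemble average 2.0
for N in (10, 100, 10_000):
    u = rng.normal(loc=2.0, scale=1.0, size=N)
    print(N, ensemble_estimate(u))   # estimates scatter about 2.0, more tightly as N grows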


• It is important to be able to tell how a r.v. is distributed about the mean (ensemble average).

  Define the variance of the r.v. u :

  $\mathrm{var}\{u\} \equiv \sigma_u^2 \equiv \overline{(u-\overline{u})^2}$

  $\sigma_u$ : standard deviation of u.
  The variance is also called the 2nd central moment.
  If two r.v.'s are identically distributed, they must have the same variance.

  Can define higher moments:

  nth moment :  $\overline{u^n} \equiv \lim_{N\to\infty}\frac{1}{N}\sum_{j=1}^{N} u_j^n$

  n = 1 : mean
  n = 2 : mean square

  central nth moment (first subtract the mean value) :

  $\overline{(u-\overline{u})^n} \equiv \lim_{N\to\infty}\frac{1}{N}\sum_{j=1}^{N}(u_j-\overline{u})^n$

• How does var{u} relate to $\overline{u^2}$ ?

  Splitting u into mean and fluctuating parts,

  $u = \overline{u} + (u-\overline{u})$   (mean + fluctuation)

  $u^2 = [\overline{u} + (u-\overline{u})]^2 = \overline{u}^2 + 2\overline{u}(u-\overline{u}) + (u-\overline{u})^2$


cf. It is easy to see from the definition of the ensemble average that arithmetic operations and averaging operations commute, e.g.

  $\overline{u} + \overline{v} = \lim_{N\to\infty}\frac{1}{N}\sum_{i=1}^{N}u_i + \lim_{N\to\infty}\frac{1}{N}\sum_{i=1}^{N}v_i = \lim_{N\to\infty}\frac{1}{N}\sum_{i=1}^{N}(u_i+v_i) = \overline{u+v}$

  $\Rightarrow \overline{u^2} = \overline{u}^2 + 2\overline{u}\,\overline{(u-\overline{u})} + \overline{(u-\overline{u})^2} = \overline{u}^2 + 2\overline{u}\cdot 0 + \mathrm{var}\{u\}$

  (mean square) = (square of mean) + variance

  or  $\mathrm{var}\{u\} = \overline{u^2} - \overline{u}^2$

  [Figure: two signals with the same mean but different variance; two signals with the same mean and the same variance.]
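A quick numerical check of the identity $\mathrm{var}\{u\} = \overline{u^2} - \overline{u}^2$ (a sketch assuming NumPy; the distribution and sample size are arbitrary):

import numpy as np

rng = np.random.default_rng(1)
u = rng.normal(loc=3.0, scale=2.0, size=100_000)   # illustrative samples, true variance 4.0

var_direct   = np.mean((u - u.mean())**2)          # central 2nd moment
var_identity = np.mean(u**2) - u.mean()**2         # mean square minus square of mean

print(var_direct, var_identity)                    # both close to 4.0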


• Can we characterize the amplitude distribution of a signal?
  - Make a frequency-of-occurrence diagram.

  Consider N samples: how many samples fall in a particular window of width Δu about a given level?

  [Figure: number of occurrences (#) versus level, in bins of width Δu.]

  Let the number of realizations increase while the window size remains the same.


• Histogram : H(u, Δu) = limit of the frequency of occurrence as N → ∞

  $\lim_{\Delta u \to 0} H(u,\Delta u) = 0$

  But  $\lim_{\Delta u \to 0} \dfrac{H(u,\Delta u)}{\Delta u} = B(u)$ : probability density function (p.d.f.)

  ← a double limit: N → ∞, Δu → 0;  B(u) is the limiting curve of the histogram.

• Properties of the PDF

  ① B(u) ≥ 0

  ② Prob{c ≤ u ≤ c + dc} ≡ B(c) dc
     - follows from the definition of B(c) from the histogram in the limit Δc → 0, N → ∞

  [Figure: B(c) versus c, with a strip of width dc at level c.]


  ③ $p\{u \le c\} = \int_{-\infty}^{c} B(u)\,du \equiv F(c)$ : probability distribution function

     $\Rightarrow p\{c_1 \le u \le c_2\} = F(c_2) - F(c_1)$

     $B(c) = \dfrac{dF}{dc}(c)$

     [Figure: F(c) and B(c) versus c.]

  ④ $\int_{-\infty}^{\infty} B(c)\,dc = 1 \;\Rightarrow\; F(\infty) = 1,\; F(-\infty) = 0$

     F(c) is a monotonic non-decreasing function.
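A minimal sketch of estimating B(u) from samples via the histogram-in-the-limit idea above (assuming NumPy; the bin count and sample size are illustrative):

import numpy as np

rng = np.random.default_rng(2)
u = rng.normal(size=200_000)                 # illustrative stationary samples

# density=True divides counts by (N * bin width), i.e. H(u, Δu)/Δu for finite N and Δu
B_est, edges = np.histogram(u, bins=100, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

print(np.trapz(B_est, centers))              # ≈ 1  (property ④)
print((B_est >= 0).all())                    # True (property ①)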

Ex) 1. Derive the pdf for a sine wave.

    2. Derive the pdf for a triangle wave.

    3. Derive the pdf for a random square wave.

    4. What happens to 1, 2, 3 if the frequency of the signal is changed?
       (A sketch for Exercise 1 is given below.)
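A sketch for Exercise 1 (worked here, not part of the original notes): for $u(t) = A\sin(\omega t)$ sampled uniformly in time, the time spent near a level u is proportional to $|dt/du|$, and each level is visited twice per period $T = 2\pi/\omega$, so

$B(u) = \dfrac{2}{T}\,\dfrac{1}{|du/dt|} = \dfrac{2}{T}\,\dfrac{1}{\omega\sqrt{A^2-u^2}} = \dfrac{1}{\pi\sqrt{A^2-u^2}}, \qquad |u| < A$,

and B(u) = 0 otherwise. The ω dependence cancels, which also answers Exercise 4 for this case: changing the frequency does not change the amplitude pdf.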

• Evaluation of moments from the pdf

  Ensemble average :

  $\overline{u} = \lim_{N\to\infty}\frac{1}{N}\sum_{i=1}^{N} u_i
   = \lim_{N\to\infty}\frac{1}{N}\Big(\sum_{c_1<u<c_1+\Delta c} u_i + \sum_{c_1+\Delta c<u<c_1+2\Delta c} u_i + \cdots\Big)
   = \int_{-\infty}^{\infty} u\,B(u)\,du$

  ← the probability average and the ensemble average are the same.


  nth moment :  $\overline{u^n} = \int_{-\infty}^{\infty} u^n B(u)\,du$

  nth central moment :  $\overline{(u-\overline{u})^n} = \int_{-\infty}^{\infty} (u-\overline{u})^n B(u)\,du$

  variance ≡ $\mathrm{var}\{u\} = \int_{-\infty}^{\infty} (u-\overline{u})^2 B(u)\,du \ge 0$   (cf. B(u) ≥ 0)

  third central moment = $\int_{-\infty}^{\infty} (u-\overline{u})^3 B(u)\,du$

  Any pdf can be split into even and odd parts:

  $B(u) = \tfrac{1}{2}[B(u)+B(-u)] + \tfrac{1}{2}[B(u)-B(-u)]$
          (even part)              (odd part)

  - The variance can be zero only when $B(u) = \delta(u-\overline{u})$ (a steady signal; all values exactly the same).

  - The third central moment is zero if B(u) is even; it can be non-zero only if B(u) has an odd part.

  - As a measure of the symmetry of the pdf :

    Skewness ≡ $S \equiv \dfrac{\overline{(u-\overline{u})^3}}{[\mathrm{var}\{u\}]^{3/2}}$   ← nondimensionalized


  S < 0 ⇒ the pdf is skewed: the contribution of $(u-\overline{u})^3$ is bigger when $u < \overline{u}$ (long tail on the negative side).

  - 4th central moment :

    $\overline{(u-\overline{u})^4} = \int_{-\infty}^{\infty} (u-\overline{u})^4 B(u)\,du$

    a measure of the peakedness (flatness) of the pdf :

    Kurtosis :  $K \equiv \dfrac{\overline{(u-\overline{u})^4}}{[\mathrm{var}\{u\}]^2}$

    [Figure: pdfs with small K (flat) and large K (peaked, heavy-tailed).]

    cf. Flatness factor (the excess over the Gaussian value) :  F ≡ K − 3

    The kurtosis of a Gaussian (normal) pdf is 3 :

    $B_G(u) = \dfrac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(u-\overline{u})^2}{2\sigma^2}}$
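A minimal numerical sketch of the skewness and kurtosis definitions above (assuming NumPy; the test distributions are illustrative):

import numpy as np

def skewness(u):
    up = u - u.mean()
    return np.mean(up**3) / np.mean(up**2)**1.5

def kurtosis(u):
    up = u - u.mean()
    return np.mean(up**4) / np.mean(up**2)**2

rng = np.random.default_rng(3)
g = rng.normal(size=500_000)          # Gaussian:            S ≈ 0, K ≈ 3
e = rng.exponential(size=500_000)     # positively skewed:   S ≈ 2, K ≈ 9

print(skewness(g), kurtosis(g))
print(skewness(e), kurtosis(e))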


Bivariate random variables

  To describe two random variables u and v together we need their joint statistics: how they interact with each other.

  [Figure: scatter of realizations in the (u, v) plane; a single realization (u₁, v₁).]

• Joint moments

  Consider two r.v.'s u & v.

  $\overline{u^2},\ \overline{v^2}$  - single-variable moments
  $\overline{(u-\overline{u})^2}$  - variance of u

  $\overline{uv^2},\ \overline{u^2 v}$  - joint moments

  $\overline{(u-\overline{u})(v-\overline{v})}$  ← (cross-)correlation, (cross-)covariance


Joint probability density function

  $B(u,v) = \lim_{\substack{N\to\infty \\ \Delta u \to 0,\ \Delta v \to 0}} \dfrac{H(u,v,\Delta u,\Delta v)}{\Delta u\,\Delta v}$

Properties of the JPDF

  ① $B(u,v) \ge 0$

  ② $\int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} B(u,v)\,du\,dv = 1$

  ③ $p\{u \le u_1,\ v \le v_1\} = F(u_1,v_1)$ : joint probability distribution function

     $B(u,v) = \dfrac{\partial^2 F(u,v)}{\partial u\,\partial v}$

     $F_u = F(u,\infty),\qquad F_v = F(\infty,v)$

  ④ marginal PDF = single-variable PDF :

     $B_u(u) = \int_{-\infty}^{\infty} B(u,v)\,dv = \dfrac{\partial F_u}{\partial u}$

     $B_v(v) = \int_{-\infty}^{\infty} B(u,v)\,du = \dfrac{\partial F_v}{\partial v}$


  [Figure: joint pdf B(u, v); a slice ("profile") at v = v₁.]

  Suppose we want the statistics of one variable given a particular value of the other
  - conditional probability. Say the particular value of v is v₁; the conditional pdf is the slice of the joint pdf at v = v₁, normalized so that it integrates to unity:

  $B(u \mid v = v_1) = \dfrac{B(u, v_1)}{B_v(v_1)}$

  The mean of u follows from the joint pdf through the marginal:

  $\overline{u} = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} u\,B(u,v)\,du\,dv
   = \int_{-\infty}^{\infty} du\; u \Big[\int_{-\infty}^{\infty} B(u,v)\,dv\Big]
   = \int_{-\infty}^{\infty} u\,B_u(u)\,du$


• m-nth joint moment :

  $\overline{u^m v^n} = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} u^m v^n B(u,v)\,du\,dv$

• m-nth joint central moment :

  $\overline{(u-\overline{u})^m (v-\overline{v})^n} = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} (u-\overline{u})^m (v-\overline{v})^n B(u,v)\,du\,dv$

  For m = 1, n = 1 :

  $C_{uv} = \overline{(u-\overline{u})(v-\overline{v})} = \overline{u'v'}
   = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} (u-\overline{u})(v-\overline{v})\,B(u,v)\,du\,dv$

  (cross-)correlation ← equivalent to the product of inertia, thus a measure of the asymmetry of B(u, v).

  Suppose $\overline{u'v'} = 0$; then we say u & v (or u' & v') are uncorrelated.

  Note $\overline{uv} = \overline{(\overline{u}+u')(\overline{v}+v')}
        = \overline{u}\,\overline{v} + \overline{u'}\,\overline{v} + \overline{v'}\,\overline{u} + \overline{u'v'}$,
  where $\overline{u'} = \overline{v'} = 0$.

  uncorrelated :  $\overline{uv} = \overline{u}\,\overline{v}$

  ← If two r.v.'s have non-zero means, the average of their product contains the product of the means even when they are uncorrelated. The product of the mean values therefore tells us nothing about whether or not the r.v.'s are correlated; only the fluctuating part is of interest.


  Define the (cross-)correlation coefficient :

  $\rho_{uv} \equiv \dfrac{\overline{u'v'}}{\sqrt{\mathrm{var}\{u\}\,\mathrm{var}\{v\}}} = \dfrac{\overline{u'v'}}{\sigma_u \sigma_v}$

  if u & v are perfectly correlated :  $\rho_{uv} = 1$
  uncorrelated :  $\rho_{uv} = 0$
  perfectly anti-correlated :  $\rho_{uv} = -1$

  so we must always have  $|\rho_{uv}| \le 1$.
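A minimal sketch of the correlation coefficient (assuming NumPy; the samples are illustrative):

import numpy as np

def corr_coeff(u, v):
    """rho_uv = mean(u'v') / (sigma_u * sigma_v)."""
    up, vp = u - u.mean(), v - v.mean()
    return np.mean(up * vp) / np.sqrt(np.mean(up**2) * np.mean(vp**2))

rng = np.random.default_rng(4)
u = rng.normal(size=100_000)
print(corr_coeff(u,  u))                         # perfectly correlated:      +1
print(corr_coeff(u, -u))                         # perfectly anti-correlated: -1
print(corr_coeff(u, rng.normal(size=100_000)))   # independent samples:      ≈ 0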


Statistical independence
  : the occurrence of u in no way affects the probability of occurrence of v, and conversely

  $\Rightarrow B(u,v) = B_u(u)\,B_v(v)$

  JPDF = product of the marginal PDFs

Questions

  1. If two r.v.'s are statistically independent, are they always uncorrelated?  Yes.

     pf.)  $\overline{(u-\overline{u})(v-\overline{v})}
            = \int\!\!\int_{-\infty}^{\infty} B(u,v)\,(u-\overline{u})(v-\overline{v})\,du\,dv
            = \int\!\!\int_{-\infty}^{\infty} B_u(u)B_v(v)\,(u-\overline{u})(v-\overline{v})\,du\,dv$
           $= \int_{-\infty}^{\infty} (u-\overline{u})B_u(u)\,du \cdot \int_{-\infty}^{\infty} (v-\overline{v})B_v(v)\,dv = 0$
           (each factor is zero)

  2. If two r.v.'s are uncorrelated, are they necessarily statistically independent?

     Assume two r.v.'s u & v with $\overline{u'v'} \ne 0$. These are clearly not statistically independent.
     No combination of these two r.v.'s can be statistically independent.
     However, we can find a combination which has zero correlation.

     e.g.)  $x = u + v,\qquad y = u - v$
            $\Rightarrow x' = u' + v',\qquad y' = u' - v'$
            $\Rightarrow \overline{x'y'} = \overline{(u'+v')(u'-v')} = \overline{u'^2} + \overline{u'v'} - \overline{u'v'} - \overline{v'^2} = \overline{u'^2} - \overline{v'^2}$


Turbulence

x & y are not statistical independence since u & v are not. If u′2 = v′2 , then x′y′ = 0 and
there is no correlation in spite of statistical independence.

∴ Statistical independence Uncorrelated uncorrelated

statistical independence
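A minimal numerical illustration of the example above (assuming NumPy; the non-Gaussian construction of correlated u, v with equal variances is illustrative): x = u + v and y = u − v come out uncorrelated, yet they are not independent.

import numpy as np

rng = np.random.default_rng(5)
N = 500_000

# Correlated, non-Gaussian u and v with equal fluctuation variances
r = rng.exponential(size=N)            # shared part, so that u'v' != 0
u = r + rng.exponential(size=N)
v = r + rng.exponential(size=N)

x, y = u + v, u - v

def cov(a, b):
    return np.mean((a - a.mean()) * (b - b.mean()))

print(cov(u, v))        # nonzero: u and v are correlated
print(cov(x, y))        # ≈ 0: x and y are uncorrelated since var(u') = var(v')
print(cov(x, y**2))     # nonzero: x and y are NOT statistically independent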

cf. Bivariate Normal (Gaussian) Distribution

  Suppose u & v are normally distributed r.v.'s with standard deviations $\sigma_u$ & $\sigma_v$, and with correlation coefficient

  $\rho_{uv} = \dfrac{\overline{(u-\overline{u})(v-\overline{v})}}{\sigma_u\sigma_v}$

  Either r.v. taken separately has a Gaussian marginal distribution, e.g.

  $B_u(u) = \dfrac{1}{\sqrt{2\pi}\,\sigma_u}\, e^{-\frac{(u-\overline{u})^2}{2\sigma_u^2}}$

  Bivariate normal p.d.f. :

  $B(u,v) = \dfrac{1}{2\pi\sigma_u\sigma_v\sqrt{1-\rho_{uv}^2}}
    \exp\!\left[-\dfrac{1}{2(1-\rho_{uv}^2)}
    \left\{\dfrac{(u-\overline{u})^2}{\sigma_u^2}
     - \dfrac{2\rho_{uv}(u-\overline{u})(v-\overline{v})}{\sigma_u\sigma_v}
     + \dfrac{(v-\overline{v})^2}{\sigma_v^2}\right\}\right]$

  cf. Central limit theorem (law of large numbers)


Estimation of statistical properties from a finite number of realizations

  - We never have an infinite number of realizations.
  - So how good are our estimators based on a finite number?

  Question : Does the estimator converge to the right answer?  i.e.  $u_N \to \overline{u}$  as  $N \to \infty$ ?

  Taking the average of the estimator, e.g. $u_N$ :

  $\lim_{M\to\infty}\frac{1}{M}\sum_{k=1}^{M} u_{N,k} = \overline{u_N}$
  or
  $\overline{u_N} = \int_{-\infty}^{\infty} u_N\, B_{u_N}(u_N)\,du_N$   (sampling p.d.f.)

  Bias : an estimator is unbiased if the average of the estimator yields the true average
         (no systematic error; it converges to the correct value).

  $\overline{u_N} = \overline{\frac{1}{N}\sum_{i=1}^{N} u_i} = \frac{1}{N}\sum_{i=1}^{N}\overline{u_i} = \frac{1}{N}\cdot N\,\overline{u} = \overline{u}$

  ∴ This estimator is unbiased.

  Recall that $u_N$ is itself random :

  $\mathrm{var}\{u_N\} \equiv \overline{(u_N - \overline{u_N})^2}$

  Does $\mathrm{var}\{u_N\} \to 0$ as $N \to \infty$ ?


  Define the variability of the estimator :

  $\varepsilon^2 \equiv \dfrac{\mathrm{var}\{u_N\}}{\overline{u}^2}$

  If ε → 0 as N → ∞, the estimator converges to the true value.

  One can show that

  $\varepsilon^2 = \dfrac{1}{N}\,\dfrac{\mathrm{var}\{u\}}{\overline{u}^2}$

  (since for independent realizations $\mathrm{var}\{u_N\} = \mathrm{var}\{\frac{1}{N}\sum u_i\} = \frac{1}{N^2}\sum \mathrm{var}\{u_i\} = \frac{\mathrm{var}\{u\}}{N}$)

  i.e. the variability of the estimator equals the variability of the r.v. itself divided by the number of independent realizations.

  error :  $\varepsilon = \dfrac{1}{\sqrt{N}}\,\dfrac{\sigma_u}{\overline{u}}$

  [Figure: the sampling pdf $B_{u_N}$ narrows about $\overline{u}$ as $1/\sqrt{N}$.]
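A minimal Monte Carlo check of $\varepsilon^2 = \mathrm{var}\{u\}/(N\,\overline{u}^2)$ (a sketch assuming NumPy; the M replications of the N-sample experiment and the test distribution are illustrative):

import numpy as np

rng = np.random.default_rng(6)
mean_true, var_true = 2.0, 1.0            # illustrative r.v.: normal(2, 1)
M = 5_000                                 # replications of the experiment

for N in (10, 100, 1000):
    u_N = rng.normal(mean_true, np.sqrt(var_true), size=(M, N)).mean(axis=1)
    eps2_measured = u_N.var() / mean_true**2
    eps2_formula  = var_true / (N * mean_true**2)
    print(N, eps2_measured, eps2_formula)   # the two agree, shrinking as 1/N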


Ex) Coin flip experiment

  How many flips are required to measure the expected value of ½ to within 1% error?

  $\overline{u} = \tfrac{1}{2}$

  $\mathrm{var}\{u\} = \overline{(u-\overline{u})^2} = \tfrac{1}{4}$   since u ∈ {0, 1}

  1% error :  $\varepsilon = \dfrac{\sqrt{\mathrm{var}\{u_N\}}}{\overline{u}} = 0.01 \;\Rightarrow\; \varepsilon^2 = 10^{-4}$

  $N = \dfrac{1}{\varepsilon^2}\left(\dfrac{\sigma_u}{\overline{u}}\right)^2 = \dfrac{1}{10^{-4}}\,\dfrac{1/4}{(1/2)^2} = 10^4$

  How well can we do with 100 flips?

  $\varepsilon = \dfrac{1}{\sqrt{N}}\,\dfrac{\sigma_u}{\overline{u}} = \dfrac{1}{\sqrt{100}}\cdot 1 = 0.1 = 10\%$
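A small simulation of the coin-flip example (assuming NumPy; the flip counts follow the numbers worked out above):

import numpy as np

rng = np.random.default_rng(7)
M = 5_000                                       # repetitions of each experiment

for N in (100, 10_000):
    flips = rng.integers(0, 2, size=(M, N))     # u in {0, 1}, true mean 1/2
    estimates = flips.mean(axis=1)              # u_N for each repetition
    eps = estimates.std() / 0.5                 # relative rms error
    print(N, eps)                               # ≈ 0.10 for N = 100, ≈ 0.01 for N = 10,000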


Turbulence

Stationary
y Random Process (stochastic
( Process))
u(t)

For stationary random process, its PDF and moments are time-independent (independent of the origin of time).

This will only approximate a real process, since stationary random process must go on forever.

Parallel Computing Lab. Hanyang Univ. 20


Ensemble average (for a stationary process, obtained from a time average) :

  $\overline{u} = \lim_{T\to\infty}\frac{1}{T}\int_0^T u(t)\,dt$

  We can define the estimator (time average) :

  $u_T = \frac{1}{T}\int_0^T u(t)\,dt$   ← itself random

  $\overline{u_T} = \overline{\frac{1}{T}\int_0^T u(t)\,dt} = \frac{1}{T}\int_0^T \overline{u(t)}\,dt$

  For a stationary random process  $\overline{u(t)} = \overline{u} = \text{const}$, so

  $\overline{u_T} = \overline{u}\,\frac{1}{T}\int_0^T dt = \overline{u}$   ← $u_T$ is an unbiased estimator

  Does $\mathrm{var}\{u_T\} \to 0$ as $T \to \infty$ ?  (i.e. $u_T \to \overline{u}$ as $T \to \infty$ ?)

  $\mathrm{var}\{u_T\} = \overline{(u_T - \overline{u})^2} = \overline{(u_T - \overline{u_T})^2}
   = \overline{\left[\frac{1}{T}\int_0^T u(t)\,dt - \overline{u}\right]^2}
   = \overline{\left[\frac{1}{T}\int_0^T (u(t)-\overline{u})\,dt\right]^2}$

  $= \frac{1}{T^2}\int_0^T\!\!\int_0^T \overline{[u(t)-\overline{u}][u(t')-\overline{u}]}\;dt\,dt'$

  $\overline{[u(t)-\overline{u}][u(t')-\overline{u}]} = \overline{u'(t)\,u'(t')} = C(\tau)$ : autocorrelation,  where $\tau = t' - t$


  Why is $\overline{u'(t)\,u'(t')}$ a function of τ = t' − t only?
  It cannot depend on the time t itself, since the process is assumed stationary.

  Why does C(τ) → 0 as τ → ∞ ?

  [Figure: autocorrelation c(τ) decaying with lag τ; a sample record u(t).]

  u'(t) and u'(t + τ) become uncorrelated as τ → ∞.

  Stationary random processes "forget" how they used to be:
  they have finite memories, i.e. they become uncorrelated with themselves.


  What is a reasonable measure of how long a process is correlated with itself?

  Define the integral scale I as a measure of the memory of the process:

  [Figure: c(τ) versus τ, starting from $c(0) = \overline{u'^2}$; I marks the width of the equivalent rectangle.]

  $I \cdot c(0) = \int_0^{\infty} c(\tau)\,d\tau$

  $c(0) = \mathrm{var}\{u\} = \overline{u'^2} = \text{const}$   for a stationary process

  Autocorrelation coefficient :

  $\rho(\tau) = \dfrac{c(\tau)}{c(0)} = \dfrac{\overline{u'(t)\,u'(t+\tau)}}{\overline{u'^2}}$

  $I = \int_0^{\infty} \rho(\tau)\,d\tau$
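A minimal sketch of estimating ρ(τ) and the integral scale from a sampled record (assuming NumPy; the synthetic signal, a first-order filtered noise with a known correlation time, and its sampling are illustrative):

import numpy as np

rng = np.random.default_rng(8)
dt, n, tau_true = 0.01, 200_000, 0.5          # sample interval [s], record length, true memory [s]

# Synthetic stationary signal: AR(1) / first-order filter, integral scale ≈ tau_true
a = np.exp(-dt / tau_true)
u = np.empty(n)
u[0] = rng.normal()
for i in range(1, n):
    u[i] = a * u[i - 1] + np.sqrt(1 - a**2) * rng.normal()

up = u - u.mean()
max_lag = int(5 * tau_true / dt)
rho = np.array([np.mean(up[:n - k] * up[k:]) for k in range(max_lag)]) / np.mean(up**2)

I = np.trapz(rho, dx=dt)                      # I = integral of rho(tau)
print(I)                                      # ≈ 0.5 s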


  Not all processes have an integral scale:

  if $\int_0^{\infty}\rho(\tau)\,d\tau = 0$ then I = 0.

  [Figure: an oscillating ρ(τ) whose integral vanishes; effective integral scales I_eff can still be defined from partial integrals of ρ(τ).]

  Back to var{u_T} :

  $\mathrm{var}\{u_T\} = \frac{1}{T^2}\int_0^T\!\!\int_0^T c(t-t')\,dt'\,dt
   = \frac{\mathrm{var}\{u\}}{T^2}\int_0^T\!\!\int_0^T \rho(t'-t)\,dt\,dt'$

  After a partial integration,

  $\mathrm{var}\{u_T\} = \frac{2\,\mathrm{var}\{u\}}{T}\int_0^T \rho(\tau)\left(1-\frac{\tau}{T}\right)d\tau$

  Since $\rho(\tau)\left(1-\frac{\tau}{T}\right) \to \rho(\tau)$ as $T \to \infty$,

  $\mathrm{var}\{u_T\} \cong \frac{2\,\mathrm{var}\{u\}}{T}\int_0^T \rho(\tau)\,d\tau = \frac{2I}{T}\,\mathrm{var}\{u\}$   (for T ≫ I)


  cf. variability :

  $\varepsilon^2 \equiv \dfrac{\mathrm{var}\{u_T\}}{\overline{u}^2} \cong \dfrac{2I}{T}\,\dfrac{\mathrm{var}\{u\}}{\overline{u}^2}$

  Therefore $u_T \to \overline{u}$ as $T \to \infty$.

  Hence for stationary random processes we don't need to perform many experiments to determine the statistics; one sufficiently long record suffices.

  Compare :

  $\varepsilon^2 = \dfrac{1}{N}\,\dfrac{\mathrm{var}\{u\}}{\overline{u}^2}$ : N independent realizations

  $\varepsilon^2 = \dfrac{2I}{T}\,\dfrac{\mathrm{var}\{u\}}{\overline{u}^2}$ : I integral scale, T record time

  Obviously the effective number of independent realizations is

  $N_{\mathrm{eff}} = \dfrac{T}{2I}$

  Segments of the time record two integral scales in length contribute to the average as if they were statistically independent.

  [Figure: a time record u(t) divided into segments of length 2I.]


ex) Jet mixing layer

  [Figure: jet of diameter D and exit velocity $U_E$; mixing-layer thickness 2δ; spatial correlation $\overline{u'(x)\,u'(x+r)}$ with length scale l.]

  $\delta \cong 0.1x$

  Spatial integral scale :  $l \cong 0.04x$

  $I = \dfrac{l}{U_c}$   where $U_c$ = local convection velocity

  $U_c \cong 0.6\,U_E,\qquad U_E = 20\ \text{m/s},\qquad D = 0.1\ \text{m}$

  Suppose we measure at x/D = 3 :

  $I \cong \dfrac{0.04\,D}{0.6\,U_E}\,\dfrac{x}{D} = 0.001\ \text{s},\qquad\text{i.e.}\quad I \cong 1\ \text{ms}$

  How long must we measure to obtain the mean velocity to within 1% ?

  $\left|\dfrac{u_T - \overline{u}}{\overline{u}}\right| < 0.01,\qquad\text{i.e.}\qquad 0.99 < \dfrac{u_T}{\overline{u}} < 1.01$


  $u'_{\mathrm{rms}} = 0.15\,U_E,\qquad U_c = 0.6\,U_E \;\Rightarrow\; \dfrac{\sigma_u}{\overline{u}} \cong 0.25$

  $\varepsilon^2 = \dfrac{2I}{T}\left(\dfrac{\sigma_u}{\overline{u}}\right)^2$

  $T \ge \dfrac{2I}{\varepsilon^2}\left(\dfrac{\sigma_u}{\overline{u}}\right)^2 = \dfrac{2(0.001)}{(0.01)^2}\,(0.25)^2 = 1.25\ \text{s}$

  e.g. plume :  $l \cong 0.04x,\qquad U_c \cong 0.5\ \text{m/s}$

  The integral scale, because of its definition, is mostly concerned with the scales containing most of the energy (variance ⇒ energy). We still need a way to characterize the scales which are much larger or smaller than those containing the energy.


Taylor microscale

  Look at the autocorrelation for small time lag τ and expand about zero:

  $\rho(\tau) = \rho(0) + \rho'(0)\,\tau + \rho''(0)\,\frac{\tau^2}{2!} + \cdots$

  We know  $\rho(0) = 1$  and  $\rho(\tau) = \rho(-\tau)$,  so

  $\rho'(\tau) = -\rho'(-\tau) \;\Rightarrow\; \rho'(0) = -\rho'(0) \;\Rightarrow\; \rho'(0) = 0$

  [Figure: ρ(τ) near the origin, with the integral scale I and the intercept λ_τ of the osculating parabola.]

  Taylor microscale (G. I. Taylor) :

  $\rho(\tau) \cong 1 + \rho''(0)\,\frac{\tau^2}{2!} = 1 - \frac{\tau^2}{\lambda_\tau^2}$   ← osculating parabola

  $\lambda_\tau^2 = -\dfrac{2}{\rho''(0)}$   : related to the curvature of ρ(τ) at the origin


What is the Taylor microscale physically?

  Look at the autocorrelation of the derivative:

  $\overline{\dfrac{du'(t)}{dt}\,\dfrac{du'(t')}{dt'}} = \dfrac{\partial^2}{\partial t\,\partial t'}\,\overline{u'(t)\,u'(t')} = \overline{u'^2}\,\dfrac{\partial^2}{\partial t\,\partial t'}\,\rho(t'-t)$

  Change variables :

  $\xi = t' + t,\quad \tau = t' - t \qquad\Rightarrow\qquad 2t' = \xi + \tau,\quad 2t = \xi - \tau$

  $\dfrac{\partial}{\partial t'} = \dfrac{\partial\xi}{\partial t'}\dfrac{\partial}{\partial\xi} + \dfrac{\partial\tau}{\partial t'}\dfrac{\partial}{\partial\tau} = \dfrac{\partial}{\partial\xi} + \dfrac{\partial}{\partial\tau}$

  $\dfrac{\partial}{\partial t} = \dfrac{\partial\xi}{\partial t}\dfrac{\partial}{\partial\xi} + \dfrac{\partial\tau}{\partial t}\dfrac{\partial}{\partial\tau} = \dfrac{\partial}{\partial\xi} - \dfrac{\partial}{\partial\tau}$

  $\dfrac{\partial^2}{\partial t\,\partial t'} = \dfrac{\partial^2}{\partial\xi^2} - \dfrac{\partial^2}{\partial\tau^2}$

  $\dfrac{\partial^2}{\partial t\,\partial t'}\,\rho(t'-t) = \left(\dfrac{\partial^2}{\partial\xi^2} - \dfrac{\partial^2}{\partial\tau^2}\right)\rho(\tau) = \dfrac{1}{\overline{u'^2}}\,\overline{\dfrac{du'(t)}{dt}\,\dfrac{du'(t')}{dt'}}$


  Hence

  $\rho''(0) = -\dfrac{1}{\overline{u'^2}}\,\overline{\left(\dfrac{du'}{dt}\right)^2} = -\dfrac{2}{\lambda_\tau^2}$

  $\therefore\; \lambda_\tau^2 = \dfrac{2\,\overline{u'^2}}{\overline{\left(\dfrac{du'}{dt}\right)^2}}$

  Is λ_τ a time scale for the derivative of the signal?

  $\dfrac{du}{dt} \sim \dfrac{\Delta u}{\Delta t} \sim \dfrac{U_s}{\tau_s}$
  where $U_s$ is a velocity scale characterizing the change in u and $\tau_s$ is the time scale over which that change occurs.

  Ans. No, because $u_{\mathrm{rms}} \sim \sqrt{\overline{u^2}}$ is not the correct velocity scale for the change in the derivative.

  λ_τ alone tells you nothing about the time scale of the derivative of the signal.
  However, when used in conjunction with the variance, it measures the mean-square fluctuating derivative:

  $\overline{\left(\dfrac{du'}{dt}\right)^2} = \dfrac{2\,\overline{u'^2}}{\lambda_\tau^2}$

  λ_τ is approximately proportional to the average distance between zero-crossings of the signal (the relation is exact for a Gaussian process).
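A minimal sketch of estimating λ_τ from a sampled signal via $\lambda_\tau^2 = 2\,\overline{u'^2}/\overline{(du'/dt)^2}$ (assuming NumPy; the synthetic band-limited signal and its sample rate are illustrative, and the derivative is a simple finite difference):

import numpy as np

rng = np.random.default_rng(9)
dt, n = 1e-3, 200_000
t = np.arange(n) * dt

# Illustrative band-limited signal: sum of random-phase sines up to ~50 Hz
freqs = rng.uniform(1.0, 50.0, size=20)
phases = rng.uniform(0.0, 2*np.pi, size=20)
u = np.sum(np.sin(2*np.pi*freqs[:, None]*t[None, :] + phases[:, None]), axis=0)

up = u - u.mean()
dudt = np.gradient(up, dt)                    # finite-difference estimate of du'/dt

lam_tau = np.sqrt(2.0 * np.mean(up**2) / np.mean(dudt**2))
print(lam_tau)   # microscale in seconds, of the order of 1/(2*pi*50 Hz) for this signal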


Spectral Analysis

  The optimal way to represent a stationary random process is by harmonic functions (cosines and sines).
  The operation for doing this is called Fourier analysis.

  Define the Fourier transform of u(t) as

  $\mathcal{F}\{u(t)\} = \tilde{u}(\omega) \equiv \dfrac{1}{2\pi}\int_{-\infty}^{\infty} e^{-i\omega t}\,u(t)\,dt$

  For the F.T. to exist, the integral must exist. Consider u(t) = 1 :

  $\mathcal{F}\{1\} = \dfrac{1}{2\pi}\int_{-\infty}^{\infty} e^{-i\omega t}\cdot 1\,dt = \dfrac{1}{2\pi}\,\dfrac{1}{-i\omega}\,e^{-i\omega t}\Big]_{-\infty}^{\infty}$   ← does not exist

  How do we take the F.T. of 1?  Notice that

  $1 = \lim_{T\to\infty} e^{-\frac{t^2}{2T^2}}$   or another appropriate function (e.g. $\dfrac{\sin(2\pi t/T)}{2\pi t/T}$)

  Appropriate function : one which goes to zero at infinity sufficiently rapidly that its F.T. exists.

  $\mathcal{F}\{\underbrace{1\cdot e^{-\frac{t^2}{2T^2}}}_{g_T(t)}\} = \dfrac{1}{2\pi}\int_{-\infty}^{\infty} e^{-i\omega t}\,e^{-\frac{t^2}{2T^2}}\,dt = \underbrace{\dfrac{T}{\sqrt{2\pi}}\,e^{-\frac{\omega^2 T^2}{2}}}_{\tilde{g}_T(\omega)}$


  [Figure: $g_T(t)$ broadening toward 1 as T grows; its transform $\tilde{g}_T(\omega)$ narrowing and growing toward δ(ω).]

  $\mathcal{F}\{1\} \underset{\text{g.f.}}{=} \lim_{T\to\infty}\dfrac{1}{2\pi}\int_{-\infty}^{\infty} e^{-i\omega t}\,e^{-\frac{t^2}{2T^2}}\,dt = \lim_{T\to\infty}\dfrac{T}{\sqrt{2\pi}}\,e^{-\frac{\omega^2 T^2}{2}} = \delta(\omega)$

  (the subscript g.f. denotes equality in the sense of generalized functions)

  We replace the function 1 with a generalized function which is the limit of a sequence of functions whose F.T.'s exist.

  The F.T. of 1 is then defined to be the limit of the F.T.'s of the sequence. This limit is said to be the F.T. of 1 in the sense of generalized functions.


Inverse Fourier Transform

  $\mathcal{F}^{-1}\{\tilde{u}(\omega)\} \equiv \int_{-\infty}^{\infty} e^{i\omega t}\,\tilde{u}(\omega)\,d\omega = u(t)$

  What is the inverse F.T. of δ(ω)?  δ(ω) is a generalized function :

  $\delta(\omega) = \lim_{T\to\infty}\dfrac{T}{\sqrt{2\pi}}\,e^{-\frac{\omega^2 T^2}{2}}$

  $\mathcal{F}^{-1}\{\delta(\omega)\} \underset{\text{g.f.}}{=} \lim_{T\to\infty}\int_{-\infty}^{\infty} e^{i\omega t}\,\dfrac{T}{\sqrt{2\pi}}\,e^{-\frac{\omega^2 T^2}{2}}\,d\omega = \lim_{T\to\infty} e^{-\frac{t^2}{2T^2}} = 1$

  No information is lost by the F.T.; the signal is simply represented in the frequency domain instead of time.

  Back to random signals. Let u(t) be a stationary random process; the F.T. of a random signal exists only in the sense of generalized functions:

  $\tilde{u}(\omega) \underset{\text{g.f.}}{=} \dfrac{1}{2\pi}\int_{-\infty}^{\infty} e^{-i\omega t}\,u(t)\,dt$

  Note that $\tilde{u}(\omega)$ is also a random function of frequency.

  Inverse Fourier transform :

  $u(t) \underset{\text{g.f.}}{=} \int_{-\infty}^{\infty} e^{i\omega t}\,\tilde{u}(\omega)\,d\omega$


  [Figure: u(t) in the time domain and $\tilde{u}(\omega)$ in the frequency domain, related by F and F⁻¹.]

  Nothing obvious is gained from the transformation into the frequency domain, except that we have decomposed the stationary time series into its sine & cosine components.


Autocorrelation

  $\overline{u(t)\,u(t')} = \overline{\int_{-\infty}^{\infty} e^{i\omega t}\,\tilde{u}(\omega)\,d\omega \cdot \int_{-\infty}^{\infty} e^{i\omega' t'}\,\tilde{u}(\omega')\,d\omega'}$

  Assume u(t) is purely real (as are all physical signals), so it equals its complex conjugate, i.e. $u(t) = u^*(t)$ :

  $\overline{u(t)\,u(t')} = \overline{\int_{-\infty}^{\infty} e^{-i\omega t}\,\tilde{u}^*(\omega)\,d\omega \cdot \int_{-\infty}^{\infty} e^{i\omega' t'}\,\tilde{u}(\omega')\,d\omega'}$

  Therefore

  $\overline{u(t)\,u(t')} = C(t'-t) = \int\!\!\int_{-\infty}^{\infty} e^{i(\omega' t' - \omega t)}\,\overline{\tilde{u}^*(\omega)\,\tilde{u}(\omega')}\,d\omega\,d\omega'$

  The R.H.S. appears to depend on both t & t', while the L.H.S. depends only on τ = t' − t.

  $\overline{\tilde{u}^*(\omega)\,\tilde{u}(\omega')}$ : a function of ω & ω' only
  - determinate, since it has been averaged
  - it must have an appropriate form to ensure a stationary process.

  Then we must have

  $\overline{\tilde{u}^*(\omega)\,\tilde{u}(\omega')} = S(\omega)\,\delta(\omega'-\omega)$   to ensure stationarity

  alternatively,

  $\overline{\tilde{u}^*(\omega)\,\tilde{u}(\omega')}\,d\omega\,d\omega' = \begin{cases} S(\omega)\,d\omega & \omega = \omega' \\ 0 & \omega \ne \omega' \end{cases}$

  ← the Fourier coefficients are uncorrelated at different frequencies for a stationary random signal.


  S(ω) : the (power) spectrum, a deterministic function defined by the pair

  $c(\tau) = \int_{-\infty}^{\infty} e^{i\omega\tau}\,S(\omega)\,d\omega$

  $S(\omega) = \dfrac{1}{2\pi}\int_{-\infty}^{\infty} e^{-i\omega\tau}\,c(\tau)\,d\tau$

  ← S(ω) and c(τ) are an F.T. pair in the ordinary sense.
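A minimal numerical sketch of the S(ω)–c(τ) pair above (assuming NumPy; the test signal, lag range, and frequency grid are illustrative): the autocorrelation is estimated in the time domain and then transformed with the same 1/2π convention, and the resulting spectrum peaks at the frequency carrying the energy.

import numpy as np

rng = np.random.default_rng(10)
dt, n = 1e-3, 100_000
t = np.arange(n) * dt
u = np.sin(2*np.pi*25.0*t) + 0.5*rng.normal(size=n)   # 25 Hz line plus noise
up = u - u.mean()

# Autocorrelation c(tau) for lags 0 .. 2 s (time-domain estimate)
lags = np.arange(2000)
c = np.array([np.mean(up[:n-k] * up[k:]) for k in lags])
tau = lags * dt

# S(omega) = (1/2pi) * integral of exp(-i omega tau) c(tau) dtau, using c(-tau) = c(tau)
omega = 2*np.pi*np.linspace(0.0, 50.0, 500)            # rad/s
S = np.array([(c[0] + 2.0*np.sum(c[1:]*np.cos(w*tau[1:]))) * dt / (2*np.pi) for w in omega])

print(omega[np.argmax(S)] / (2*np.pi))                 # ≈ 25 Hz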


Properties of the spectrum

  ① It is the F.T. of the autocorrelation :

     $S(\omega) = \dfrac{1}{2\pi}\int_{-\infty}^{\infty} e^{-i\omega\tau}\,c(\tau)\,d\tau \quad (\ge 0)$

  ② S(0) is related to the integral scale :

     $S(0) = \dfrac{1}{2\pi}\int_{-\infty}^{\infty} c(\tau)\,d\tau = \dfrac{1}{\pi}\int_0^{\infty} c(\tau)\,d\tau = \dfrac{\overline{u'^2}}{\pi}\,I$

  ③ The area under the spectrum is the variance of the signal :

     $c(0) = \overline{u'^2} = \int_{-\infty}^{\infty} S(\omega)\,d\omega$   ← energy, power

     $\left(\text{hence}\quad I = \dfrac{\pi\,S(0)}{\int_{-\infty}^{\infty} S(\omega)\,d\omega}\right)$

  ④ The spectrum is a symmetric function :

     $S(-\omega) = \dfrac{1}{2\pi}\int_{-\infty}^{\infty} e^{+i\omega\tau}\,c(\tau)\,d\tau
      = \dfrac{1}{2\pi}\int_{-\infty}^{\infty} e^{-i\omega\tau}\,c(-\tau)\,d\tau = S(\omega)$
     (substituting τ → −τ and using c(−τ) = c(τ))

  From ③,  $\dfrac{\overline{u'^2}}{2} = \int_0^{\infty} S(\omega)\,d\omega$ : useful for checking the validity of a spectral analysis,
  since $\overline{u'^2}$ can be obtained by other methods.


  Recall

  $\tilde{u}(\omega) \underset{\text{g.f.}}{=} \dfrac{1}{2\pi}\int_{-\infty}^{\infty} e^{-i\omega t}\,u(t)\,dt$

  Let

  $\tilde{u}_T(\omega) = \dfrac{1}{2\pi}\int_{-T}^{T} e^{-i\omega t}\,u(t)\,dt$ : finite-time estimator for $\tilde{u}(\omega)$

  $S_T(\omega) = \dfrac{2\pi\,|\tilde{u}_T(\omega)|^2}{2T}$ : spectral estimator

  One can show, after a partial integration, that

  $\overline{S_T(\omega)} = \dfrac{1}{2\pi}\int_{-2T}^{2T} e^{-i\omega\tau}\,c(\tau)\left[1 - \dfrac{|\tau|}{2T}\right]d\tau$

  $\Rightarrow \lim_{T\to\infty} \overline{S_T(\omega)} = S(\omega)$   ← the estimator is (asymptotically) unbiased.

  But $S_T(\omega)$ is not a satisfactory estimator of S(ω), no matter how large T is, since

  $\varepsilon^2 = \dfrac{\mathrm{var}\{S_T(\omega)\}}{S(\omega)^2} = 1$

  i.e. the rms fluctuations of the estimate are as big as the spectrum itself.

  However, $S_T(\omega)$ can be used to estimate a smoothed version of S(ω).
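A minimal sketch of the ~100% scatter of a raw spectral estimate and of smoothing by averaging over record segments (assuming NumPy; the discrete normalization, segment length, and block-averaging scheme are illustrative rather than the notes' continuous-time definitions):

import numpy as np

rng = np.random.default_rng(11)
dt, n_seg, seg_len = 1e-2, 64, 2048

# Illustrative stationary signal: first-order filtered noise with memory ~0.2 s
a = np.exp(-dt / 0.2)
u = np.empty(n_seg * seg_len); u[0] = rng.normal()
for i in range(1, u.size):
    u[i] = a*u[i-1] + np.sqrt(1 - a**2)*rng.normal()

segs = (u - u.mean()).reshape(n_seg, seg_len)
periodograms = (np.abs(np.fft.rfft(segs, axis=1))**2) * dt / seg_len   # one raw estimate per segment
smoothed = periodograms.mean(axis=0)                                   # block averaging smooths it

band = slice(10, 200)
rel_scatter_raw = (periodograms[:, band].std(axis=0) / smoothed[band]).mean()
print(rel_scatter_raw)                    # ≈ 1 : a single raw estimate fluctuates as much as S itself
print(rel_scatter_raw / np.sqrt(n_seg))   # expected relative scatter after averaging 64 segments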


Cross correlation

  $\overline{u(t)\,v(t')} = C_{uv}(\tau),\qquad \tau = t' - t$

  $C_{uv}(\tau)$ need not have its peak at the origin → $C_{uv}(\tau)$ is not symmetric.

  $C_{uv}(t'-t) = \overline{u(t)\,v(t')} = \int\!\!\int_{-\infty}^{\infty} e^{i(\omega' t' - \omega t)}\,\overline{\tilde{u}^*(\omega)\,\tilde{v}(\omega')}\,d\omega\,d\omega'$

  As before, we must have

  $\overline{\tilde{u}^*(\omega)\,\tilde{v}(\omega')} = S_{uv}(\omega)\,\delta(\omega'-\omega)$ : the cross spectrum

  Before (autocorrelation) :
  $C(\tau) = \overline{u(t)\,u(t+\tau)} = \mathcal{F}^{-1}\{S(\omega)\},\qquad S(\omega) = \mathcal{F}\{C(\tau)\},\qquad \overline{u^2} = \int_{-\infty}^{\infty} S(\omega)\,d\omega$

  Now (cross correlation) :
  $C_{uv}(\tau) = \overline{u(t)\,v(t+\tau)} = \mathcal{F}^{-1}\{S_{uv}(\omega)\},\qquad S_{uv}(\omega) = \mathcal{F}\{C_{uv}(\tau)\},\qquad \overline{uv} = \int_{-\infty}^{\infty} S_{uv}(\omega)\,d\omega$

  And since $C_{uv}(\tau) \ne C_{uv}(-\tau)$ in general :

  $C_{vu}(\tau) = C_{uv}(-\tau)$   (since $\overline{v(t)\,u(t-\tau)} = \overline{v(t+\tau)\,u(t)}$ by stationarity)

  $S_{uv}(\omega) = S_{vu}^*(\omega)$
