Distributions
The idea of these notes is to complement Section 1.3 of the notes regarding the Fourier transform. The material presented here was taken from the book of Jeffrey Rauch, Partial Differential Equations, Graduate Texts in Mathematics, Springer-Verlag (see [1]).
1. Distributions
Distribution theory arises in several contexts. One is the treatment of impulsive forces. Newton's second law affirms that the rate of change of momentum is equal to the applied force, $\frac{dp}{dt} = F$. Consider an intense force which acts over a very short interval of time $t_0 < t < t_0 + \Delta t$. An example is the force applied by the strike of a hammer. The impulse, $I$, is defined as $I := \int F(t)\,dt$, thus
$$p(t_0 + \Delta t) = p(t_0) + I.$$
In the limit, as $\Delta t$ tends to zero, one arrives at an idealized force which acts instantaneously to produce a jump $I$ in the momentum $p$. Formally, the force law satisfies
$$(1.1)\qquad F = 0 \ \text{ for } t \neq t_0 \quad\text{and}\quad \int F(t)\,dt = I.$$
This idealized force is denoted $I\delta_{t_0}$, and $\delta_{t_0}$ is called Dirac's delta function, though no function can satisfy (1.1). The idealized equation of motion is $\frac{dp}{dt} = I\,\delta_{t_0}$. The solution satisfies $p(t_0^+) - p(t_0^-) = I$. Such idealizations have proven useful in a variety of problems of mechanics and electricity. The mathematical framework was developed by Laurent Schwartz in the 1940s.
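For a concrete (if simplified) model of such a force, one can take the rectangular profile $F(t) = \frac{I}{\Delta t}\,\chi_{[t_0,\,t_0+\Delta t]}(t)$. Then $\int F(t)\,dt = I$, and integrating Newton's law gives
$$p(t) = p(t_0) + \frac{I}{\Delta t}\,(t - t_0), \qquad t_0 \le t \le t_0 + \Delta t,$$
so the momentum climbs linearly by the total amount $I$ over the interval; as $\Delta t \to 0$ this ramp steepens into the instantaneous jump described above.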
We introduce some notation next. Let $\Omega \subset \mathbb{R}^n$ be an open subset. The set $C_0^\infty(\Omega)$ of all infinitely differentiable functions with compact support in $\Omega$ will be denoted by $D(\Omega)$, and the set $C^\infty(\Omega)$ of all infinitely differentiable functions on $\Omega$ will be denoted by $E(\Omega)$. These sets of functions are referred to as test functions.
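For instance, $e^{x}$ and all polynomials belong to $E(\mathbb{R})$ but not to $D(\mathbb{R})$, since they do not have compact support, while the standard bump function
$$j(x) = \begin{cases} e^{-1/(1-|x|^2)}, & |x| < 1,\\ 0, & |x| \ge 1,\end{cases}$$
belongs to $D(\mathbb{R}^n)$.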
Every $f \in L^1_{loc}(\Omega)$ defines a distribution $l_f$ by $\langle l_f, \varphi\rangle = \int_\Omega f(x)\varphi(x)\,dx$ for $\varphi \in D(\Omega)$. In this sense, distributions are generalizations of functions and are sometimes called generalized functions. Two locally integrable functions define the same distribution if and only if the functions are equal almost everywhere. We say that a distribution $l$ is a locally integrable function, and write $l \in L^1_{loc}(\Omega)$, if $l = l_f$ for some $f \in L^1_{loc}(\Omega)$. Similarly, we say that $l$ is continuous (resp. $C^\infty(\Omega)$) if $l = l_f$ for some $f \in C(\Omega)$ (resp. $C^\infty(\Omega)$).
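For instance, $\chi_{[0,\infty)}$ and $\chi_{(0,\infty)}$ differ only at the single point $0$, a set of measure zero, and for every $\varphi \in D(\mathbb{R})$ both give $\langle l, \varphi\rangle = \int_0^\infty \varphi(x)\,dx$; the two functions therefore define the same distribution.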
Example 1.6. If $j \in D(\mathbb{R}^n)$ with $\int j(x)\,dx = 1$, let $j_\varepsilon(x) = \varepsilon^{-n} j(x/\varepsilon)$. Then $j_\varepsilon \to \delta_0$ in $D'(\mathbb{R}^n)$ as $\varepsilon \to 0$.
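The convergence can be checked by a routine change of variables: for $\varphi \in D(\mathbb{R}^n)$,
$$\langle j_\varepsilon, \varphi\rangle = \int \varepsilon^{-n} j(x/\varepsilon)\,\varphi(x)\,dx = \int j(y)\,\varphi(\varepsilon y)\,dy \longrightarrow \varphi(0)\int j(y)\,dy = \varphi(0) = \langle\delta_0,\varphi\rangle,$$
by dominated convergence, since $\varphi(\varepsilon y) \to \varphi(0)$ pointwise and $|j(y)\varphi(\varepsilon y)| \le \|\varphi\|_\infty\,|j(y)|$.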
This motivates the definition $\langle \tau_h l, \varphi\rangle = \langle l, \tau_{-h}\varphi\rangle$. It is easy to check that $\tau_h l$ defined as above is a distribution and that this definition agrees with $\tau_h f$ when $l = l_f$.
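For example, assuming the convention $\tau_h f(x) = f(x-h)$ (translation by $h$), the definition gives for the Dirac mass $\langle \tau_h\delta_0, \varphi\rangle = \langle\delta_0, \tau_{-h}\varphi\rangle = (\tau_{-h}\varphi)(0) = \varphi(h)$, that is, $\tau_h\delta_0 = \delta_h$: translating the delta at the origin produces the delta at $h$.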
The derivative of a distribution is defined by $\langle \partial l/\partial x_j, \varphi\rangle = -\langle l, \partial\varphi/\partial x_j\rangle$; so defined, it is the unique continuous extension of $\partial/\partial x_j$ on $D$.
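As a consistency check, if $l = l_f$ with $f \in C^1(\Omega)$, then integration by parts (with no boundary terms, since $\varphi$ has compact support) gives
$$\Big\langle \frac{\partial l_f}{\partial x_j}, \varphi\Big\rangle = -\int_\Omega f\,\frac{\partial\varphi}{\partial x_j}\,dx = \int_\Omega \frac{\partial f}{\partial x_j}\,\varphi\,dx = \langle l_{\partial f/\partial x_j}, \varphi\rangle,$$
so the distributional derivative agrees with the classical one for smooth functions.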
Let us apply the above procedure to find the derivative, in the sense of distributions, of the Heaviside function $H(x) = \chi_{[0,\infty)}(x)$ defined on $\mathbb{R}$. The difference quotient
$$\frac{\tau_{-h}H - H}{h} = h^{-1}\chi_{[0,h]}$$
converges to $\delta$ in the sense of distributions. Thus $\frac{dH}{dx} = \delta$. Observe that the difference quotient converges to zero almost everywhere. Since $H$ is not constant, zero cannot be the desired derivative. The pointwise limit gives the wrong answer and the distributional derivative is the right answer.
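The convergence claimed above is easy to verify: for $\varphi \in D(\mathbb{R})$,
$$\Big\langle \frac{\tau_{-h}H - H}{h}, \varphi\Big\rangle = \frac{1}{h}\int_0^h \varphi(x)\,dx \longrightarrow \varphi(0) = \langle\delta,\varphi\rangle \qquad (h \to 0),$$
by the continuity of $\varphi$. With $dH/dx = \delta$ in hand, the idealized equation of motion from the introduction is solved by $p(t) = p(t_0^-) + I\,H(t-t_0)$, since $\frac{dp}{dt} = I\,\frac{d}{dt}H(t-t_0) = I\,\delta_{t_0}$, and this $p$ indeed jumps by $I$ at $t_0$.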
Remark 1.10. The proof of the uniqueness of this extension can be found in [1], Appendix, Proposition 8.
Therefore ϕ ∗ δ = ϕ.
It is not difficult to show that for $l \in D'(\mathbb{R}^n)$,
$$\partial^\alpha(\varphi * l) = \varphi * \partial^\alpha l = (\partial^\alpha\varphi) * l.$$
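For instance, taking $l = H$ and $\varphi \in D(\mathbb{R})$, one has $(\varphi * H)(x) = \int_0^\infty \varphi(x-t)\,dt = \int_{-\infty}^x \varphi(s)\,ds$, and the rule above gives $\frac{d}{dx}(\varphi * H) = \varphi * \frac{dH}{dx} = \varphi * \delta = \varphi$, which is just the fundamental theorem of calculus.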
We end this section with the following result, whose proof can be found in [1], Appendix, Proposition 3.
Proposition 1.15. If $l \in D'(\mathbb{R}^n)$ and $\varphi \in D(\mathbb{R}^n)$, then $l * \varphi$ is equal to the $C^\infty$ function whose value at $x$ is $\langle l, \tau_x\tilde\varphi\rangle$.
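For instance, taking $l = \delta$ and writing, as usual, $\tilde\varphi(x) = \varphi(-x)$, the proposition gives $(\delta * \varphi)(x) = \langle\delta, \tau_x\tilde\varphi\rangle = (\tau_x\tilde\varphi)(0) = \tilde\varphi(-x) = \varphi(x)$ (using $\tau_x\tilde\varphi(y) = \tilde\varphi(y-x)$), recovering the identity $\varphi * \delta = \varphi$ noted above.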
Proposition 1.24. For any $f \in S'(\mathbb{R}^n)$ there exists exactly one solution $u \in S'(\mathbb{R}^n)$ to (1.10). The solution is given by formula (1.11). In particular, if $f \in S$, then $u \in S$. If $f \in L^2$, then $D^\alpha u \in L^2(\mathbb{R}^n)$ for all $|\alpha| \le 2$.
The second application is a Liouville-type theorem. More precisely:
Theorem 1.25 (Generalized Liouville Theorem). Suppose that $P(D)$ is a constant coefficient partial differential operator such that $P(\xi) \neq 0$ for $\xi \neq 0$. If $u \in S'(\mathbb{R}^n)$ satisfies $P(D)u = 0$, then $u$ is a polynomial in $x$.
Proof. Taking the Fourier transform of the equation we obtain
$$\mathcal{F}(P(D)u) = P(\xi)\,\hat u = 0.$$
Since $P(\xi) \neq 0$ for $\xi \neq 0$, it follows that $\operatorname{supp}\hat u \subset \{0\}$. Thus $\hat u$ has to be a finite linear combination of derivatives of the delta function,
$$\hat u = \sum_\alpha c_\alpha D^\alpha\delta.$$
Applying the inverse Fourier transform we get
$$u = \sum_\alpha c_\alpha\, \mathcal{F}^{-1}(D^\alpha\delta) = \sum_\alpha c_\alpha (-x)^\alpha\, \mathcal{F}^{-1}\delta = \sum_\alpha c_\alpha (-x)^\alpha (2\pi)^{-n/2},$$
a polynomial in $x$.
Corollary 1.26. The only bounded harmonic (resp. holomorphic) functions
on Rn (resp. C) are the constants.
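A sketch of the standard argument: a bounded function defines a tempered distribution, so a bounded harmonic $u$ lies in $S'(\mathbb{R}^n)$ and satisfies $\Delta u = 0$, whose symbol $P(\xi) = -|\xi|^2$ vanishes only at $\xi = 0$; by Theorem 1.25, $u$ is a polynomial, and a bounded polynomial is constant. For the holomorphic case one applies the theorem to the Cauchy-Riemann operator $\bar\partial = \frac{1}{2}(\partial_x + i\partial_y)$, whose symbol $\frac{1}{2}(i\xi_1 - \xi_2)$ also vanishes only at $\xi = 0$.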
since $\frac{\partial\varphi}{\partial\theta}$ is periodic in $\theta$. Therefore this term is always $0$.
On the other hand we have
$$(1.15)\qquad \int_\varepsilon^\infty \log r^2\,\frac{\partial\varphi}{\partial r}(r,\theta)\,dr = -\int_\varepsilon^\infty \frac{\partial}{\partial r}(\log r^2)\,\varphi(r,\theta)\,dr \;-\; \log(\varepsilon^2)\,\varphi(\varepsilon,\theta)$$
and
$$(1.16)\qquad \begin{aligned} \int_\varepsilon^\infty r\log r^2\,\frac{\partial^2\varphi}{\partial r^2}(r,\theta)\,dr &= -\int_\varepsilon^\infty \frac{\partial}{\partial r}(r\log r^2)\,\frac{\partial\varphi}{\partial r}(r,\theta)\,dr \;-\; \varepsilon\log(\varepsilon^2)\,\frac{\partial\varphi}{\partial r}(\varepsilon,\theta)\\ &= \int_\varepsilon^\infty \frac{\partial^2}{\partial r^2}(r\log r^2)\,\varphi(r,\theta)\,dr \;-\; \varepsilon\log(\varepsilon^2)\,\frac{\partial\varphi}{\partial r}(\varepsilon,\theta) \;+\; \frac{\partial}{\partial r}(r\log r^2)\Big|_{r=\varepsilon}\,\varphi(\varepsilon,\theta).\end{aligned}$$
Now $\frac{\partial}{\partial r}(r\log r^2) = \log r^2 + 2$, $\frac{\partial^2}{\partial r^2}(r\log r^2) = \frac{2}{r}$ and $\frac{\partial}{\partial r}(\log r^2) = \frac{2}{r}$.
Gathering together the information in (1.14), (1.15) and (1.16) we obtain
$$\begin{aligned} \int_0^{2\pi}\!\!\int_\varepsilon^\infty &\log r^2\left(\frac{\partial^2}{\partial r^2} + \frac{1}{r}\frac{\partial}{\partial r} + \frac{1}{r^2}\frac{\partial^2}{\partial\theta^2}\right)\varphi(r,\theta)\; r\,dr\,d\theta\\ &= \int_0^{2\pi}\!\!\int_\varepsilon^\infty \left(-\frac{2}{r} + \frac{2}{r}\right)\varphi(r,\theta)\,dr\,d\theta\\ &\quad + \int_0^{2\pi}\left(-\log\varepsilon^2 + \log\varepsilon^2 + 2\right)\varphi(\varepsilon,\theta)\,d\theta\\ &\quad + \int_0^{2\pi}\left(-\varepsilon\log\varepsilon^2\right)\frac{\partial\varphi}{\partial r}(\varepsilon,\theta)\,d\theta.\end{aligned}$$
Thus
$$(1.17)\qquad \langle\Delta u, \varphi\rangle = \lim_{\varepsilon\to 0}\left[\int_0^{2\pi} 2\,\varphi(\varepsilon,\theta)\,d\theta \;-\; \int_0^{2\pi} \varepsilon\log\varepsilon^2\,\frac{\partial\varphi}{\partial r}(\varepsilon,\theta)\,d\theta\right].$$
Since $\varphi$ is continuous, $\varphi(\varepsilon,\theta) \to \varphi(0,\theta)$ as $\varepsilon \to 0$, and so the first term in (1.17) approaches $4\pi\langle\delta,\varphi\rangle$. In the second term in (1.17), $\frac{\partial\varphi}{\partial r}(\varepsilon,\theta)$ remains bounded while $\varepsilon\log\varepsilon^2 \to 0$ as $\varepsilon \to 0$. Hence
$$\Delta\log(x^2+y^2) = 4\pi\delta.$$
Therefore $\log(x^2+y^2)$ is not a weak solution of $\Delta u = 0$.
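Equivalently, dividing by $4\pi$: the function $E = \frac{1}{4\pi}\log(x^2+y^2) = \frac{1}{2\pi}\log r$ satisfies $\Delta E = \delta$ in $D'(\mathbb{R}^2)$, i.e. $E$ is a fundamental solution of the Laplacian in the plane.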
The previous computations allow us to solve the Poisson equation
(1.18) ∆u = f for any f.
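Here is a sketch of the idea, stated for simplicity for $f \in D(\mathbb{R}^2)$: with $E = \frac{1}{4\pi}\log(x^2+y^2)$ as above, set $u = f * E$. By the differentiation rule for convolutions, $\Delta u = \Delta(f * E) = f * \Delta E = f * \delta = f$, so $u$ solves (1.18).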
A final remark.
Remark 1.27. It is clear that $S'(\mathbb{R}^n) \subset D'(\mathbb{R}^n)$. What is not true is that any distribution in $D'(\mathbb{R}^n)$ is a tempered distribution. For example, the function $f(x) = e^{x^2}$ on $\mathbb{R}$ defines the distribution
$$\langle f, \varphi\rangle = \int_{-\infty}^\infty e^{x^2}\varphi(x)\,dx.$$
Observe that $e^{-x^2/2} \in S(\mathbb{R})$, and taking $\varphi(x) = e^{-x^2/2}$ we would have
$$\langle f,\varphi\rangle = \int_{-\infty}^\infty e^{x^2}e^{-x^2/2}\,dx = \int_{-\infty}^\infty e^{x^2/2}\,dx = +\infty,$$
so $f$ does not define a tempered distribution.
On the other hand, if $f \in L^1_{loc}(\mathbb{R}^n)$ grows at most polynomially, say $|f(x)| \le C(1+|x|)^N$, then
$$\int_{\mathbb{R}^n} f(x)\varphi(x)\,dx$$
defines a tempered distribution.
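Indeed, under that growth assumption, for every $\varphi \in S(\mathbb{R}^n)$
$$\Big|\int_{\mathbb{R}^n} f(x)\varphi(x)\,dx\Big| \le C\int (1+|x|)^{N}|\varphi(x)|\,dx \le C'\,\sup_x (1+|x|)^{N+n+1}|\varphi(x)| < \infty,$$
which is controlled by a Schwartz seminorm of $\varphi$, so the pairing is well defined and continuous on $S(\mathbb{R}^n)$.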
References
[1] J. Rauch, Partial Differential Equations, Graduate Texts in Mathematics, 128.
Springer-Verlag, New York, 1991. x+263 pp.