
LECTURE 3

Random Variables

3.1 DEFINITION

It is often convenient to represent the outcome of a random experiment by a number.


A random variable (r.v.) is such a representation. To be more precise, let (Ω, F , P) be a
probability space. A random variable is a mapping X : Ω → ℝ that assigns a real number X(ω) to each outcome ω ∈ Ω.

Figure .. Random variable as a mapping ω ↦ X(ω).

Example 3.1. Let the random variable X be the number of heads in n coin flips. The
sample space is Ω = {H , T}n , the possible outcomes of n coin flips; then

X ∈ {0, 1, 2, . . . , n}

Example 3.2. Consider packet arrival times t1 , t2 , . . . in the interval (0, T]. The sample
space Ω consists of the empty string (no packet) and all finite length strings of the form
(t1 , t2 , . . . , tn ) such that 0 < t1 ≤ t2 ≤ ⋅ ⋅ ⋅ ≤ tn ≤ T. Define the random variable X to be
the length of the string; then X ∈ {0, 1, 2, 3, . . .}.
Example 3.3. Consider the voltage across a capacitor. The sample space Ω = ℝ. Define
the random variables

X(ω) = ω,

Y (ω) = { +1,  ω ≥ 0,
          −1,  otherwise.

Example 3.4. Let (Ω, F , P) be a probability space. For a given event A ∈ F , define the
indicator random variable
X(ω) = { 1,  ω ∈ A,
         0,  otherwise.

We use the notation 1A (ω) or χA (ω) to denote the indicator random variable for A.

Throughout the course, we use uppercase letters, say, X, Y , Z, Φ, Θ, to denote random variables, and lowercase letters to denote the values taken by the random variables.
Thus, X(ω) = x means that the random variable X takes on the value x when the outcome
is ω.
As a representation of a random experiment in the probability space (Ω, F , P), the
random variable X can be viewed as an outcome of a random experiment on its own. The
sample space is ℝ and the set of events is the Borel σ-algebra B. An event A ∈ B occurs if
X ∈ A and its probability is
P({ω ∈ Ω : X(ω) ∈ A}),

which is determined by the probability measure P of the underlying random experiment and the inverse image of A under the mapping X : Ω → ℝ. Thus, (Ω, F , P) induces a
probability space (ℝ, B, P X ), where

P X (A) = P({ω ∈ Ω : X(ω) ∈ A}), A ∈ B.

An implicit assumption here is that for every A ∈ B, the inverse image {ω ∈ Ω : X(ω) ∈ A}
is an event in F . A mapping X(ω) satisfying this condition is called measurable (with
respect to F ) and we will always assume that a given mapping is measurable.
Since we typically deal with multiple random variables on the same probability space,
we will use the notation P{X ∈ A} instead of the more formal notation PX (A) or P({ω ∈
Ω : X(ω) ∈ A}).

Figure .. The inverse image of a set A under X(ω), i.e., {ω : X(ω) ∈ A}.



3.2 CUMULATIVE DISTRIBUTION FUNCTION

To determine P{X ∈ A} for any Borel set A, i.e., any set generated by open intervals via
countable unions, intersections, and complements, it suffices to specify P{X ∈ (a, b)} or
P{X ∈ (a, b]} for all −∞ < a < b < ∞. Then the probability of any other Borel set can be
determined by the axioms of probability. Equivalently, it suffices to specify the cumulative
distribution function (cdf) of the random variable X:

FX (x) = P{X ≤ x} = P{X ∈ (−∞, x ]}, x ∈ ℝ.

The cdf of a random variable satisfies the following properties.


1. FX (x) is nonnegative, i.e.,

   FX (x) ≥ 0,  x ∈ ℝ.

2. FX (x) is monotonically nondecreasing, i.e.,

   FX (a) ≤ FX (b),  a < b.

3. Limits.

   lim_{x→−∞} FX (x) = 0  and  lim_{x→+∞} FX (x) = 1.

4. FX (x) is right continuous, i.e.,

   FX (a+) := lim_{x→a+} FX (x) = FX (a).

5. Probability of a singleton.

   P{X = a} = FX (a) − FX (a−),

   where FX (a−) := lim_{x→a−} FX (x).


Throughout, we use the notation X ∼ F(x) to mean that the random variable X has the
cdf F(x).

Figure .. An illustration of a cumulative distribution function (cdf).



3.3 PROBABILITY MASS FUNCTION (PMF)

A random variable X is said to be discrete if FX (x) consists only of steps over a countable set 𝒳 as illustrated in Figure ..
Figure .. The cdf of a discrete random variable.

A discrete random variable X can be completely specified by its probability mass function (pmf)

pX (x) = P{X = x},  x ∈ 𝒳.

The set 𝒳 is often referred to as the alphabet of X. Clearly, pX (x) ≥ 0, ∑_{x∈𝒳} pX (x) = 1, and

P{X ∈ A} = ∑_{x∈A∩𝒳} pX (x).

Throughout, we use the notation X ∼ p(x) to mean that X is a discrete random variable with pmf p(x).
We review a few famous discrete random variables.
Bernoulli. X ∼ Bern(p), p ∈ [0, 1], has the pmf

p X (1) = p and p X (0) = 1 − p.

This is the indicator of observing a head from flipping a coin with bias p.
Geometric. X ∼ Geom(p), p ∈ [0, 1], has the pmf

p X (k) = p(1 − p)k−1 , k = 1, 2, 3, . . . .

This is the number of independent coin flips of bias p until the first head.
Binomial. X ∼ Binom(n, p), p ∈ [0, 1], n = 1, 2, . . . , has the pmf

pX (k) = (n choose k) p^k (1 − p)^{n−k},  k = 0, 1, . . . , n.
This is the number of heads in n independent coin flips of bias p.
Poisson. X ∼ Poisson(λ), λ > 0, has the pmf

pX (k) = (λ^k / k!) e^{−λ},  k = 0, 1, 2, . . . .
Figure .. The cdf of a continuous random variable.

This is often used to characterize the number of random arrivals in a unit time interval,
the number of random points in a unit area, and so on.
Let X ∼ Binom(n, λ/n). Then, its pmf for a fixed k is

pX (k) = (n choose k) (λ/n)^k (1 − λ/n)^{n−k}
       = [n(n − 1) ⋅⋅⋅ (n − k + 1) / n^k] (λ^k / k!) (1 − λ/n)^n (1 − λ/n)^{−k},

which converges to the Poisson(λ) pmf (λ^k / k!) e^{−λ} as n → ∞. Thus, Poisson(λ) is the limit of Binom(n, λ/n).
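
This convergence is easy to check numerically. The following Python sketch (the values λ = 3 and k = 5 are arbitrary illustrations, not from the lecture) compares the Binom(n, λ/n) pmf with the Poisson(λ) pmf as n grows.

from math import comb, exp, factorial

lam, k = 3.0, 5                      # illustrative values (assumed)

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

poisson_limit = lam**k / factorial(k) * exp(-lam)

for n in (10, 100, 1000, 10000):
    print(n, binom_pmf(k, n, lam / n))   # approaches the Poisson value below
print("Poisson(lam):", poisson_limit)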
Example .. In a popular lottery called “Powerball,” the winning combination of num-
bers is selected uniformly at random among ,, possible combinations. Suppose
that αn tickets are sold. What is the probability that there is no winner?
Since the number N of winners is a Binom(αn, 1/n) random variable with n = 2.92 ×
108 very large, we can use the Poission approximation to obtain

P{N = 0} = 󶀣1 − 󶀳 → e −α , as n → ∞.
1 αn
n

If α = 1, there is no winner with probability e^{−1} ≈ 37%. If α = 2, this probability decreases to e^{−2} ≈ 14%. Thus, even with about 600 million tickets sold, there is a significant chance that the lottery has no winner and rolls over to the next week (with a bigger jackpot).

3.4 PROBABILITY DENSITY FUNCTION

A random variable is said to be continuous if its cdf is continuous, as illustrated in Figure ..
If FX (x) is continuous and differentiable (except possibly over a countable set), then
X can be completely specified by its probability density function (pdf) f X (x) such that

FX (x) = ∫_{−∞}^{x} fX (u) du.

If FX (x) is differentiable everywhere, then by the definition of the derivative,

fX (x) = dFX (x)/dx
       = lim_{Δx→0} [FX (x + Δx) − FX (x)] / Δx
       = lim_{Δx→0} P{x < X ≤ x + Δx} / Δx.    (.)
The pdf of a random variable satisfies the following properties.
1. fX (x) is nonnegative, i.e.,

   fX (x) ≥ 0,  x ∈ ℝ.

2. Normalization.

   ∫_{−∞}^{∞} fX (x) dx = 1.

3. For any event A ⊂ ℝ,

   P{X ∈ A} = ∫_{A} fX (x) dx.

   In particular,

   P{a < X ≤ b} = P{a < X < b} = P{a ≤ X < b} = P{a ≤ X ≤ b} = ∫_{a}^{b} fX (x) dx.

Note that fX (x) should not be interpreted as the probability that X = x. In fact, fX (x)
can be greater than 1. In light of (.), it is fX (x)Δx that can be interpreted as an approximation of the probability P{x < X ≤ x + Δx} for Δx sufficiently small.
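
As a concrete check of this interpretation, the following Python sketch (with assumed values of λ, x, and Δx for an Exp(λ) random variable; the exponential pdf is introduced below) compares fX (x)Δx with the exact probability P{x < X ≤ x + Δx}.

from math import exp

lam, x, dx = 2.0, 0.5, 1e-3         # illustrative values (assumed)

def F(t):                            # cdf of Exp(lam)
    return 1 - exp(-lam * t) if t >= 0 else 0.0

def f(t):                            # pdf of Exp(lam)
    return lam * exp(-lam * t) if t >= 0 else 0.0

exact = F(x + dx) - F(x)             # P{x < X <= x + dx}
print(exact, f(x) * dx)              # the two numbers agree to about 0.1%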
Throughout, we use the notation X ∼ f (x) to mean that X is a continuous random
variable with pdf f (x).
We review a few famous continuous random variables.
Uniform. X ∼ Unif[a, b], a < b, has the pdf

fX (x) = { 1/(b − a),  x ∈ [a, b],
           0,          otherwise.
This is often used to model quantization noise.
Exponential. X ∼ Exp(λ), λ > 0, has the pdf

fX (x) = { λe^{−λx},  x ≥ 0,
           0,         otherwise.
This is often used to model the service time in a queue or the time between two random
arrivals. An exponential random variable satisfies the memoryless property
P{X > x + t | X > t} = P{X > x + t} / P{X > t} = P{X > x},   t, x > 0.
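
The memoryless property can also be seen empirically; the simulation sketch below uses assumed values of λ, t, and x.

import random
from math import exp

random.seed(0)
lam, t, x = 1.5, 0.4, 0.7                    # illustrative values (assumed)
samples = [random.expovariate(lam) for _ in range(200_000)]

p_uncond = sum(s > x for s in samples) / len(samples)
survivors = [s for s in samples if s > t]
p_cond = sum(s > x + t for s in survivors) / len(survivors)
print(p_uncond, p_cond, exp(-lam * x))       # all three values are close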

Example .. Suppose that for every t > 0, the number of packet arrivals during time
interval (0, t] is a Poisson(λt) random variable, i.e.,
pN (n) = ((λt)^n / n!) e^{−λt},  n = 0, 1, 2, . . . .
Let X be the time until the first packet arrival. Then the event {X > t} is equivalent to the
event {N = 0} and thus

FX (t) = 1 − P{X > t}
       = 1 − P{N = 0}
       = 1 − e^{−λt}.

Hence, fX (t) = λe^{−λt} and X ∼ Exp(λ).

Gaussian. X ∼ N(μ, σ 2 ) has the pdf


fX (x) = (1/√(2πσ²)) e^{−(x−μ)²/(2σ²)}.

This characterizes many random phenomena such as thermal and shot noise, and is also
called a normal random variable. The cdf of the standard normal random variable N(0, 1)
is
Φ(x) = ∫_{−∞}^{x} (1/√(2π)) e^{−u²/2} du.

Its complement is
Q(x) = 1 − Φ(x) = P{X > x}.

The numerical values of the Q function are often used to compute probabilities of any
Gaussian random variable Y ∼ N(μ, σ²) as

P{Y > y} = P{X > (y − μ)/σ} = Q((y − μ)/σ).    (.)
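
The Q function itself can be evaluated numerically through the complementary error function, since Φ(x) = (1/2)(1 + erf(x/√2)). The Python sketch below (the values of μ, σ, and y are assumed illustrations) computes P{Y > y} this way.

from math import erfc, sqrt

def Q(x):
    # Q(x) = 1 - Phi(x) for the standard normal, via the complementary error function
    return 0.5 * erfc(x / sqrt(2))

mu, sigma, y = 1000.0, 20.0, 1020.0          # illustrative values (assumed)
print(Q((y - mu) / sigma))                   # Q(1) is approximately 0.1587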

3.5 FUNCTIONS OF A RANDOM VARIABLE

Let X be a random variable and g : ℝ → ℝ be a given function. Then Y = g(X) is a random variable and its probability distribution can be expressed through that of X. For example, if X is discrete, then Y is discrete and

pY (y) = P{Y = y}
       = P{g(X) = y}
       = ∑_{x: g(x)=y} pX (x).

Figure .. The pdf of the standard normal random variable and the Q function.

In general,

FY (y) = P{Y ≤ y} = P{g(X) ≤ y},

which can be further simplified in many cases.
Example . (Linear function). Let X ∼ FX (x) and Y = aX + b, a ̸= 0. If a > 0, then
y−b y−b
FY (y) = P{aX + b ≤ y} = P󶁃X ≤ 󶁓 = FX 󶀣 󶀳.
a a
Taking derivative with respect to y, we have
y−b
fY (y) = f 󶀤 󶀴
1
a X a
We can similarly show that if a < 0, then
y−b −
FY (y) = 1 − FX 󶀤󶀣 󶀳 󶀴
a

y−b
and
fY (y) = − f X 󶀣 󶀳.
1
a a

Figure .. A linear function.



Combining both cases,

fY (y) = (1/|a|) fX ((y − b)/a).

As a special case, let X ∼ N(μ, σ²), i.e.,

fX (x) = (1/√(2πσ²)) e^{−(x−μ)²/(2σ²)}.

Again setting Y = aX + b, we have

fY (y) = (1/|a|) fX ((y − b)/a)
       = (1/|a|) (1/√(2πσ²)) e^{−((y−b)/a − μ)²/(2σ²)}
       = (1/√(2π(aσ)²)) e^{−(y−b−aμ)²/(2a²σ²)}.

Therefore, Y ∼ N(aμ + b, a2 σ 2 ). This result justifies the use of the Q function in (.) to
compute probabilities for an arbitrary Gaussian random variable.
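
This conclusion can also be confirmed by simulation; the sketch below uses arbitrary assumed values of μ, σ, a, and b and checks the mean and standard deviation of Y = aX + b.

import random, statistics

random.seed(1)
mu, sigma, a, b = 2.0, 3.0, -0.5, 4.0        # illustrative values (assumed)
ys = [a * random.gauss(mu, sigma) + b for _ in range(100_000)]

print(statistics.mean(ys))    # close to a*mu + b = 3.0
print(statistics.stdev(ys))   # close to |a|*sigma = 1.5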

Example . (Quadratic function). Let X ∼ FX (x) and Y = X 2 . If y < 0, then FY (y) = 0.
Otherwise,

FY (y) = P 󶁁−󵀂y ≤ X ≤ 󵀂y 󶁑 = FX 󶀡󵀂y󶀱 − FX 󶀡(−󵀂y)− 󶀱

If X is continuous with pdf f X (x), then

fY (y) = 󶀡 f (−󵀂y) + f X (󵀂y)󶀱.


1
2󵀂 y X

Figure .. A quadratic function.



The above two examples can be generalized as follows.

Proposition .. Let X ∼ f X (x), (x) be differentiable, and Y = (X). Then

f X (xi )
fY (y) = 󵠈
i=1 | (xi )|
󳰀
,

where x1 , x2 , . . . are the solutions of the equation y = (x) and 󳰀 (xi ) is the derivative
of  evaluated at xi .
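
As an illustration of the proposition (with an assumed distribution, not an example from the lecture), take g(x) = x² and X ∼ Exp(1). The roots of y = x² are ±√y and g′(x) = 2x, and since fX (−√y) = 0 the formula gives fY (y) = fX (√y)/(2√y). The sketch below compares fY (y)Δy with the exact probability of a small interval.

from math import exp, sqrt

def f_X(x):                       # pdf of Exp(1)
    return exp(-x) if x >= 0 else 0.0

def f_Y(y):                       # proposition applied to g(x) = x^2: roots +/- sqrt(y), g'(x) = 2x
    r = sqrt(y)
    return (f_X(r) + f_X(-r)) / (2 * r)

y, dy = 0.8, 1e-4                 # illustrative values (assumed)
exact = exp(-sqrt(y)) - exp(-sqrt(y + dy))   # P{y < Y <= y + dy} = F_X(sqrt(y+dy)) - F_X(sqrt(y))
print(f_Y(y) * dy, exact)         # the two values agree closely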

The distribution of Y can be written explicitly even when g is not differentiable.

Example . (Limiter). Let X be a r.v. with Laplacian pdf fX (x) = (1/2) e^{−|x|}, and let Y be
defined by the function of X shown in Figure .. Consider the following cases.
∙ If y < −a, clearly FY (y) = 0.
∙ If y = −a,

  FY (−a) = FX (−1) = ∫_{−∞}^{−1} (1/2) e^{x} dx = (1/2) e^{−1}.

∙ If −a < y < a,

  FY (y) = P{Y ≤ y}
         = P{aX ≤ y}
         = P{X ≤ y/a} = FX (y/a)
         = (1/2) e^{−1} + ∫_{−1}^{y/a} (1/2) e^{−|x|} dx.

Figure .. The limiter function.



∙ If y ≥ a, FY (y) = 1.
Combining these cases, the cdf of Y is sketched in Figure ..

Figure .. The cdf of the random variable Y.

3.6 APPLICATION: GENERATION OF RANDOM VARIABLES

Suppose that we are given a uniform random variable X ∼ Unif[0, 1] and wish to generate a random variable Y with prescribed cdf F(y). If F(y) is continuous and strictly increasing, set

Y = F −1 (X).

Figure .. Generation of Y ∼ F(y) from a uniform random variable X.
Then, since X ∼ Unif[0, 1] and 0 ≤ F(y) ≤ 1,
FY (y) = P{Y ≤ y}
= P{F −1 (X) ≤ y}
= P{X ≤ F(y)} (.)
= F(y).
Thus, Y has the desired cdf F(y). For example, to generate Y ∼ Exp(λ) from X ∼ Unif[0, 1],
we set

Y = −(1/λ) ln(1 − X).
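
A short simulation sketch (the rate λ below is an assumed value) confirms that this transformation produces samples with the Exp(λ) cdf.

import random
from math import exp, log

random.seed(4)
lam = 2.0                                            # assumed rate
ys = [-log(1 - random.random()) / lam for _ in range(100_000)]

t = 0.5
print(sum(y <= t for y in ys) / len(ys), 1 - exp(-lam * t))   # both close to 0.632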

More generally, for an arbitrary cdf F(y), we define

F −1 (x) := min{y : x ≤ F(y)}, x ∈ (0, 1]. (.)

Since F(y) is right continuous, the above minimum is well-defined. Furthermore, since
F(y) is monotonically nondecreasing, F −1 (x) ≤ y iff x ≤ F(y). We now set Y = F −1 (X) as
before, but under this new definition of “inverse.” It follows immediately that the equality
in (.) continues to hold and that Y ∼ F(y). For example, to generate Y ∼ Bern(p), we
set
Y = { 0,  X ≤ 1 − p,
      1,  otherwise.

In conclusion, we can generate a random variable with any desired distribution from a
Unif[0, 1] random variable.

Figure .. Generation of a Bern(p) random variable.

Conversely, a uniform random variable can be generated from any continuous random
variable. Let X be a continuous random variable with cdf F(x) and Y = F(X). Since
F(x) ∈ [0, 1], FY (y) = P{Y ≤ y} = 0 for y < 0 and FY (y) = 1 for y > 1. For y ∈ [0, 1], let
F −1 (y) be defined as in (.). Then

FY (y) = P{Y ≤ y}
= P{F(X) ≤ y}
= P{X ≤ F −1 (y)}
= F(F −1 (y)) (.)
= y,

where the equality in (.) follows by the definition of F −1 (y). Hence, Y ∼ Unif[0, 1]. For
example, let X ∼ Exp(λ) and

Y = { 1 − exp(−λX),  X ≥ 0,
      0,             otherwise.

Then Y ∼ Unif[0, 1].



The exact generation of a uniform random variable, which requires an infinite number of bits to describe, is not possible in any digital computer. One can instead use the following approximation. Let X1 , X2 , . . . , Xn be independent and identically distributed (i.i.d.) Bern(1/2) random variables, and let

Y = 0.X1 X2 . . . Xn

be a fraction in base 2 that lies between 0 and 1. Then Y is a discrete random variable uniformly distributed over the set {k/2^n : k = 0, 1, . . . , 2^n − 1} and its cdf F(y) converges to that of a Unif[0, 1] random variable for every y as n → ∞. Thus, by flipping a fair coin many times, one can simulate a uniform random variable.
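
A small sketch of this construction (the number of coin flips n is an assumed value) builds Y from fair bits and checks one value of its cdf.

import random

random.seed(5)
n = 16                                               # assumed number of fair coin flips

def uniform_from_coins(n):
    # Y = 0.X1 X2 ... Xn in base 2 with X_i i.i.d. Bern(1/2)
    return sum(random.getrandbits(1) * 2.0 ** -(i + 1) for i in range(n))

ys = [uniform_from_coins(n) for _ in range(100_000)]
print(sum(y <= 0.25 for y in ys) / len(ys))          # close to 0.25, as for Unif[0, 1]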
The fairness of coin flips is not essential to this procedure. Suppose that Z1 and Z2 are i.i.d. Bern(p) random variables. The following procedure, due to von Neumann, can generate a single Bern(1/2) random variable, even when the bias p is unknown. Let

X = { 0,  (Z1 , Z2 ) = (0, 1),
      1,  (Z1 , Z2 ) = (1, 0).

If (Z1 , Z2 ) = (0, 0) or (1, 1), then the outcome is ignored. Since P{(Z1 , Z2 ) = (0, 1)} = P{(Z1 , Z2 ) = (1, 0)} = p(1 − p), clearly pX (0) = pX (1) = 1/2. By repeating the same procedure, one can generate a sequence of i.i.d. Bern(1/2) random variables from a sequence of i.i.d. Bern(p) random variables.
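
The sketch below implements this extractor; the bias p is an assumed value that the procedure itself never uses.

import random

random.seed(6)
p = 0.3                                              # assumed (and, to the extractor, unknown) bias

def biased_bit():
    return 1 if random.random() < p else 0

def von_neumann_bit():
    # draw pairs until (0,1) or (1,0); each occurs with probability p(1-p)
    while True:
        z1, z2 = biased_bit(), biased_bit()
        if z1 != z2:
            return z1          # (1, 0) maps to 1 and (0, 1) maps to 0

bits = [von_neumann_bit() for _ in range(100_000)]
print(sum(bits) / len(bits))   # close to 1/2 regardless of p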

PROBLEMS

.. Probabilities from a cdf. Let X be a random variable with the cdf shown below.
[Figure: the cdf F(x); the vertical axis marks the levels 1/3 and 2/3, and the horizontal axis marks the points x = 1, 2, 3, 4.]

Find the probabilities of the following events.


(a) {X = 2}.
(b) {X < 2}.
(c) {X = 2} ∪ {0.5 ≤ X ≤ 1.5}.
(d) {X = 2} ∪ {0.5 ≤ X ≤ 3}.
.. Gaussian probabilities. Let X ∼ N(1000, 400). Express the following in terms of
the Q function.

(a) P{0 < X < 1020}.


(b) P{X < 1020|X > 960}.
.. Laplacian. Let X ∼ f (x) = (1/2) e^{−|x|}.
(a) Sketch the cdf of X.
(b) Find P{|X| ≤ 2 or X ≥ 0} .
(c) Find P{|X| + |X − 3| ≤ 3} .
(d) Find P{X ≥ 0 | X ≤ 1} .
.. Distance to the nearest star. Let the random variable N be the number of stars in a
region of space of volume V . Assume that N is a Poisson r.v. with pmf

pN (n) = e^{−ρV} (ρV)^n / n!,   for n = 0, 1, 2, . . . ,
where ρ is the “density” of stars in space. We choose an arbitrary point in space
and define the random variable X to be the distance from the chosen point to the
nearest star. Find the pdf of X (in terms of ρ).
.. Time until the n-th arrival. Let the random variable N be the number of packets
arriving during time (0, t]. Suppose that N is Poisson with pmf

pN (n) = ((λt)^n / n!) e^{−λt}   for n = 0, 1, 2, . . . .
Let the random variable Y be the time to get the n-th packet. Find the pdf of Y .
.. Uniform arrival. The arrival time of a professor to his office is uniformly distributed
in the interval between  and  am.
(a) Find the probability that the professor will arrive during the next minute given
that he has not arrived by :.
(b) Repeat for :.
.. Lognormal distribution. Let X ∼ N (0, σ 2 ). Find the pdf of Y = e X (known as the
lognormal pdf).
.. Random phase signal. Let Y (t) = sin(ωt + Θ) be a sinusoidal signal with random
phase Θ ∼ U [−π, π]. Find the pdf of the random variable Y (t) (assume here that
both t and the radial frequency ω are constant). Comment on the dependence of
the pdf of Y (t) on time t.
.. Quantizer. Let X ∼ Exp(λ), i.e., an exponential random variable with parameter λ
and Y = ⌊X⌋, i.e., Y = k for k ≤ X < k + 1, k = 0, 1, 2, . . . .
(a) Find the pmf of Y .
(b) Find the pdf of the quantization error Z = X − Y .

.. Gambling. Alice enters a casino with one unit of capital. She looks at her watch to
generate a uniform random variable U ∼ Unif[0, 1], then bets the amount U on a
fair coin flip. Her wealth is thus given by the r.v.

X = { 1 + U ,  with probability 1/2,
      1 − U ,  with probability 1/2.

Find the cdf of X.


.. Nonlinear processing. Let X ∼ Unif [−1, 1]. Define the random variable

Y = { X² + 1,  if |X| ≥ 0.5,
      0,       otherwise.

Find and sketch the cdf of Y .
