CH 4
Variance
©Fall 2024
Outlines
1 Random Variables
2 Cumulative Distribution Function
3 Expectation
4 Variance
5 Jointly Distributed Random Variables
4.1 Random Variables
Examples:
(1) The number of students absent in class, which can be 0, 1, 2, · · · .
⇒ The number of absences is a random variable.
(2) Quiz scores in a statistics course.
(3) Heights of male/female students.
From the examples above, we find that random variables can be discrete
(outcome ∈ N) or continuous (outcome ∈ R).
Definition
(a) Discrete random variable: a random variable that can assume only
certain separated values.
(b) Continuous random variable: a random variable that can assume any
of the uncountably many values within a given range.
Notation:
If X is a discrete random variable, then the probability mass function
(pmf) is defined as
p(x) ≜ P(X = x)
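Note that any pmf satisfies p(x) ≥ 0 and Σ_x p(x) = 1, since X must take exactly one of its possible values.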
Example: Let X be a random variable defined as the sum of two fair dice.
Find the pmf of X .
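A minimal sketch (in Python; not part of the slides) that tabulates this pmf by enumerating the 36 equally likely outcomes:

    from fractions import Fraction
    from collections import Counter

    # Enumerate the 36 equally likely outcomes of two fair dice.
    counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))

    # p(x) = (number of outcomes summing to x) / 36
    pmf = {x: Fraction(n, 36) for x, n in sorted(counts.items())}
    for x, p in pmf.items():
        print(x, p)   # e.g. p(2) = 1/36, p(7) = 1/6, p(12) = 1/36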
Example: Find the value of k which makes P(X = x) below a pmf.
x          0    1     2      3
P(X = x)   k    2k    0.3    4k
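Solution sketch: since the probabilities must sum to 1, k + 2k + 0.3 + 4k = 1, so k = 0.1.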
Definition
X is a continuous random variable if there exists a nonnegative function f (x), called the probability density function (pdf) of X , such that
P(X ∈ B) = ∫_B f (x)dx
for every set B of real numbers.
Properties of pdf: Suppose B = [a, b] with a < b, and let Ω denote the
support of X . Then
(a) f (x) ≥ 0 for all x ∈ Ω.
(b) P(X ∈ B) = P(a ≤ X ≤ b) = ∫_a^b f (x)dx.
(c) P(X = a) = ∫_a^a f (x)dx = 0.
(d) P(X ∈ Ω) = 1 and P(X ∈ Ωᶜ) = 0.
Example: Find the value of k which makes f (x) below a pdf:
f (x) = kx² for 0 < x ≤ 1,
f (x) = k(2 − x) for 1 < x ≤ 2,
f (x) = 0 otherwise.
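Solution sketch: ∫_0^1 kx² dx + ∫_1^2 k(2 − x)dx = k/3 + k/2 = 5k/6 = 1, so k = 6/5.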
4.2 Cumulative Distribution Function
Definition
The cumulative distribution function (CDF) of a random variable X is defined as
F (x) ≜ P(X ≤ x), x ∈ R.
If X is continuous with pdf f , then F (x) = ∫_{−∞}^x f (t)dt, so that f (x) = F ′(x) wherever the derivative exists.
Properties of CDF:
(a) F (−∞) = 0 and F (∞) = 1.
(b) F (x) is a non-decreasing function of x.
(c) F (b) − F (a) = P(a < X ≤ b).
Example: Suppose that X is a discrete random variable with pmf
P(X = 1) = 1/2, P(X = 2) = 1/3, P(X = 3) = 1/6.
Find F (x) and draw a plot of the CDF.
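Solution sketch: F (x) = 0 for x < 1; F (x) = 1/2 for 1 ≤ x < 2; F (x) = 1/2 + 1/3 = 5/6 for 2 ≤ x < 3; F (x) = 1 for x ≥ 3. The plot is a right-continuous step function with jumps of size p(x) at x = 1, 2, 3.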
Example: The range of random variable X is Ω = {1, 2, 3, 4, 5}. For
x ∈ Ω, the CDF for X is given by
x        1       2      3       4     5
F (x)    0.1k    0.2    0.5k    k     4k²
(a) Find k and P(2 < X ≤ 4).
(b) Find the pmf.
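Solution sketch: (a) since F (5) = 4k² = 1 and F must be non-decreasing and nonnegative, k = 0.5; then P(2 < X ≤ 4) = F (4) − F (2) = 0.5 − 0.2 = 0.3. (b) Differencing the CDF gives p(1) = 0.05, p(2) = 0.15, p(3) = 0.05, p(4) = 0.25, p(5) = 0.5.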
Example: Continuing an earlier example,
f (x) = K (4x − 2x²) for 0 < x < 2, and f (x) = 0 otherwise.
Find F (x) and calculate P(X > 1).
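Worked answers: K = 3/8 and P(X > 1) = 1/2. A quick check with sympy (a sketch; not part of the slides):

    import sympy as sp

    x, K = sp.symbols('x K', positive=True)
    f = K * (4*x - 2*x**2)

    # The density must integrate to 1 over (0, 2), which pins down K.
    Ksol = sp.solve(sp.integrate(f, (x, 0, 2)) - 1, K)[0]
    # P(X > 1) is the integral of the density from 1 to 2.
    prob = sp.integrate(f.subs(K, Ksol), (x, 1, 2))
    print(Ksol, prob)   # 3/8 1/2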
4.3 Expectation
Definition (Expectation)
Let X be a random variable. The expectation or expected value of X is
denoted as µ ≜ E (X ).
In particular,
(a) if X is a discrete random variable with possible outcomes x_1 , · · · , x_n , then
µ = Σ_{i=1}^n x_i P(X = x_i );
(b) if X is a continuous random variable with pdf f (x), then
µ = ∫_{−∞}^∞ x f (x)dx.
Example: Calculate µ:
(a) X is the outcome of tossing a fair die.
(b) I = 1 if A occurs and I = 0 if A does not occur, where A is an event; I is called an indicator function.
(c) X is a continuous random variable with pdf
f (x) = 1/1.5 if 0 < x < 1.5, and f (x) = 0 otherwise.
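Solution sketch: (a) µ = (1 + 2 + · · · + 6)/6 = 3.5; (b) E (I ) = 1 · P(A) + 0 · P(Aᶜ) = P(A); (c) µ = ∫_0^{1.5} x/1.5 dx = 0.75.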
Definition
(a) if X is a discrete random variable with pmf p(x), then
E {g (X )} = Σ_x g (x)P(X = x);
(b) if X is a continuous random variable with pdf f (x), then
E {g (X )} = ∫_{−∞}^∞ g (x)f (x)dx.
Example: Calculate E (X ²).
Example: The time, in hours, that it takes to locate and repair an electrical
breakdown in a certain factory is a random variable, called X , whose pdf is
given by
f (x) = 1 if 0 < x < 1, and f (x) = 0 otherwise.
If the cost involved in a breakdown of duration x is x³, what is the
expected cost of such a breakdown?
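Solution sketch: the expected cost is E (X ³) = ∫_0^1 x³ dx = 1/4.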
4.4 Variance
Definition
If X is a random variable with mean µ = E (X ), then the variance of X ,
denoted by var(X ), is defined as
var(X ) = E {(X − µ)²}.
Moreover, √var(X ) is called the standard deviation of X .
Note: var(X ) = E (X ²) − {E (X )}².
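Indeed, E {(X − µ)²} = E (X ²) − 2µ E (X ) + µ² = E (X ²) − µ².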
Example: Recall the indicator I = 1 if A occurs and I = 0 otherwise.
Calculate var(I ).
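Solution sketch: since I ² = I , E (I ²) = E (I ) = P(A), so var(I ) = P(A) − P(A)² = P(A){1 − P(A)}.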
Example: X is a continuous random variable with pdf
f (x) = 1/1.5 if 0 < x < 1.5, and f (x) = 0 otherwise.
Find var(X ).
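Solution sketch: E (X ²) = ∫_0^{1.5} x²/1.5 dx = 0.75, and E (X ) = 0.75 from the earlier example, so var(X ) = 0.75 − 0.75² = 0.1875.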
Property of var(X ):
Suppose that X is a random variable and a, b are nonzero constants.
Q: What is var(aX ± b)?
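A: expanding the definition shows var(aX ± b) = a² var(X ): the shift b does not change deviations from the mean, and the scale a comes out squared.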
4.5 Jointly Distributed Random Variables
Definition
When X and Y are both discrete random variables with possible values
x_1 , x_2 , · · · and y_1 , y_2 , · · · , respectively, the joint probability mass
function of X and Y at X = x_i and Y = y_j is defined as
p(x_i , y_j ) ≜ P(X = x_i , Y = y_j ), ∀i = 1, · · · , j = 1, · · · .
Example: Suppose that 3 batteries are randomly chosen from a group of
3 new, 4 used but still working, and 5 defective batteries. If we let X and
Y denote, respectively, the number of new and used but still working
batteries that are chosen, then find the joint probability mass function of
X and Y .
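A counting sketch (assumes Python 3.8+ for math.comb; not part of the slides) that tabulates p(i, j) = C(3,i) C(4,j) C(5,3−i−j) / C(12,3):

    from math import comb
    from fractions import Fraction

    # 3 new, 4 used-but-working, 5 defective batteries; 3 drawn at random.
    total = comb(12, 3)
    for i in range(4):              # i = number of new batteries drawn
        for j in range(4 - i):      # j = number of used batteries drawn
            p = Fraction(comb(3, i) * comb(4, j) * comb(5, 3 - i - j), total)
            print(i, j, p)          # joint pmf p(i, j)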
Example: Suppose that 15 percent of the families in a certain community
have no children, 20 percent have one, 35 percent have two, and 30
percent have three children; suppose further that each child is equally
likely (and independently) to be a boy or girl. Let B be the number of
boys and let G denote the number of girls. If a family is chosen at random
from this community, then find the joint pmf of B and G .
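A sketch (not from the slides) that builds this joint pmf by conditioning on the number of children:

    from math import comb
    from fractions import Fraction

    # P(number of children = n) for n = 0, 1, 2, 3.
    n_children = {0: Fraction(15, 100), 1: Fraction(20, 100),
                  2: Fraction(35, 100), 3: Fraction(30, 100)}

    # p(b, g) = P(b + g children) * C(b+g, b) * (1/2)^(b+g),
    # since each child is independently a boy or a girl with probability 1/2.
    pmf = {}
    for n, pn in n_children.items():
        for b in range(n + 1):
            pmf[(b, n - b)] = pn * comb(n, b) * Fraction(1, 2)**n

    print(pmf[(1, 1)])   # P(B = 1, G = 1) = 0.35 * 2 / 4 = 7/40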
Definition
Let X and Y denote continuous random variables, and let f : R² → R⁺ be a
bivariate continuous function. Suppose that C is a set in the two-dimensional
plane. Then the probability that (X , Y ) lies in C is defined as
P{(X , Y ) ∈ C } = ∬_{(x,y )∈C} f (x, y )dydx,
and f is called the joint probability density function of X and Y .
In particular,
P(X ∈ [a, b]) = ∫_a^b f_X (x)dx = ∫_a^b { ∫_{−∞}^∞ f (x, y )dy } dx
and
P(Y ∈ [c, d]) = ∫_c^d f_Y (y )dy = ∫_c^d { ∫_{−∞}^∞ f (x, y )dx } dy,
where f_X and f_Y denote the marginal pdfs of X and Y .
Example: The joint density function of X and Y is given by
f (x, y ) = 2e^{−x−2y} for 0 < x < ∞, 0 < y < ∞, and f (x, y ) = 0 otherwise.
Compute (a) P(X > 1, Y < 1) and (b) P(X < a) for some constant
a > 0.
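Worked answers: P(X > 1, Y < 1) = e^{−1}(1 − e^{−2}) and P(X < a) = 1 − e^{−a}. A sympy check (a sketch; not part of the slides):

    import sympy as sp

    x, y, a = sp.symbols('x y a', positive=True)
    f = 2 * sp.exp(-x - 2*y)

    pa = sp.integrate(f, (x, 1, sp.oo), (y, 0, 1))   # P(X > 1, Y < 1)
    pb = sp.integrate(f, (y, 0, sp.oo), (x, 0, a))   # P(X < a), via the marginal of X
    print(sp.simplify(pa))   # (1 - exp(-2))*exp(-1)
    print(sp.simplify(pb))   # 1 - exp(-a)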
Independence:
Definition
The random variables X and Y are said to be independent if for any two
sets of real numbers A and B,
P(X ∈ A, Y ∈ B) = P(X ∈ A)P(Y ∈ B).
Equivalently, jointly continuous X and Y are independent if and only if
f (x, y ) = f_X (x)f_Y (y ) for all x, y ,
and discrete X and Y are independent if and only if p(x, y ) = p_X (x)p_Y (y ) for all x, y .
Example: Continuing an earlier example, X and Y are two random
variables whose joint density function is given by
f (x, y ) = 2e^{−x−2y} for 0 < x < ∞, 0 < y < ∞, and f (x, y ) = 0 otherwise.
Check the independence of X and Y .
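Solution sketch: f_X (x) = ∫_0^∞ 2e^{−x−2y} dy = e^{−x} and f_Y (y ) = ∫_0^∞ 2e^{−x−2y} dx = 2e^{−2y}, so f (x, y ) = f_X (x)f_Y (y ) and X and Y are independent.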
Conditional probability:
For discrete random variables, the conditional pmf of X given Y = y is defined as
p_{X |Y} (x|y ) ≜ p(x, y )/p_Y (y ), provided p_Y (y ) > 0.
Similarly, for jointly continuous random variables, the conditional pdf of X given Y = y is defined as
f_{X |Y} (x|y ) ≜ f (x, y )/f_Y (y ), provided f_Y (y ) > 0.
Example: Suppose that p(x, y ), the joint pmf of X and Y , is given by
p(0, 0) = 0.4, p(0, 1) = 0.2, p(1, 0) = 0.1, p(1, 1) = 0.3.
Calculate the conditional pmf of X given Y = 1.
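Solution sketch: p_Y (1) = p(0, 1) + p(1, 1) = 0.5, so p_{X |Y}(0|1) = 0.2/0.5 = 0.4 and p_{X |Y}(1|1) = 0.3/0.5 = 0.6.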
Example: The joint pdf of X and Y is given by
f (x, y ) = (12/5)x(2 − x − y ) for 0 < x < 1, 0 < y < 1, and f (x, y ) = 0 otherwise.
Compute the conditional pdf of X given Y = y for y ∈ (0, 1).
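Solution sketch: for y ∈ (0, 1), f_Y (y ) = ∫_0^1 (12/5)x(2 − x − y )dx = (12/5)(2/3 − y /2), so f_{X |Y}(x|y ) = x(2 − x − y )/(2/3 − y /2) = 6x(2 − x − y )/(4 − 3y ) for 0 < x < 1.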
Expectation:
Suppose that X and Y are two random variables with joint pmf
p(x, y ) or pdf f (x, y ).
Let g (x, y ) denote an arbitrary function of X and Y . Then the
expected value of g (x, y ) is defined as
E {g (X , Y )} = Σ_x Σ_y g (x, y )p(x, y ) (discrete version).
E {g (X , Y )} = ∫_{−∞}^∞ ∫_{−∞}^∞ g (x, y )f (x, y )dydx (continuous version).
In particular, we consider g (x, y ) = x + y . Then we have
E {g (X , Y )} = E (X + Y ) = E (X ) + E (Y ).
⇒ expectation has the property of additivity.
Justification by continuous random variables:
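Sketch: E (X + Y ) = ∬ (x + y )f (x, y )dydx = ∬ x f (x, y )dydx + ∬ y f (x, y )dydx = ∫ x f_X (x)dx + ∫ y f_Y (y )dy = E (X ) + E (Y ).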
How about E (X − Y )?
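(By the same argument with g (x, y ) = x − y , E (X − Y ) = E (X ) − E (Y ).)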
In general, the additivity still holds for n (n > 2) random variables.
That is, for X1 , · · · , Xn , we have
E (X_1 + X_2 + · · · + X_n ) = Σ_{i=1}^n E (X_i ).
Example: A construction firm has recently sent in bids for 3 jobs worth
(in profits) 10, 20, and 40 thousand dollars. If its probabilities of
winning the jobs are, respectively, 0.2, 0.8, and 0.3, what is the firm’s
expected total profit?
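Solution sketch: write the total profit as 10 I_1 + 20 I_2 + 40 I_3 , where I_j indicates winning job j. By additivity, the expected total profit is 10(0.2) + 20(0.8) + 40(0.3) = 30 thousand dollars.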
Example: A secretary has typed N letters along with their respective
envelopes. The envelopes get mixed up when they fall on the floor. If the
letters are placed in the mixed-up envelopes in a completely random
manner (that is, each letter is equally likely to end up in any of the
envelopes), what is the expected number of letters that are placed in the
correct envelopes?
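Solution sketch: let I_j indicate that letter j lands in its own envelope. Then E (I_j ) = 1/N, so the expected number of matches is E (Σ_{j=1}^N I_j ) = N · (1/N) = 1, for every N.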
Definition (covariance)
Let X and Y be two random variables. Then the covariance of X and Y is
defined as
cov(X , Y ) ≜ E {(X − µX )(Y − µY )},
where µX = E (X ) and µY = E (Y ). Equivalently,
cov(X , Y ) = E (XY ) − E (X )E (Y ).
Properties:
(a) cov(X , Y ) = cov(Y , X ).
(b) cov(X , X ) = var(X ).
(c) cov(aX , Y ) = acov(X , Y ) for a ̸= 0.
(d) for three random variables X , Y , Z ,
cov(X + Z , Y ) = cov(X , Y ) + cov(Z , Y ).
(e) In general, cov( Σ_{i=1}^n X_i , Σ_{j=1}^m Y_j ) = Σ_{i=1}^n Σ_{j=1}^m cov(X_i , Y_j ).
Theorem
If X and Y are independent random variables, then cov(X , Y ) = 0, and
thus, var(X + Y ) = var(X ) + var(Y ).
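Remark: the converse is false in general; cov(X , Y ) = 0 does not imply that X and Y are independent.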