3-Mathematical Expectation
Teacher: Qiang Ma
maqiang809@scu.edu.cn
Definition 1
Let X be a random variable with probability distribution f(x). The mean, or expected value, of X is
\[ \mu = E(X) = \sum_{x} x f(x) \]
if X is discrete, and
\[ \mu = E(X) = \int_{-\infty}^{\infty} x f(x)\,dx \]
if X is continuous.
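As a quick sketch (not from the slides), the discrete case of Definition 1 can be computed directly from a probability mass function stored as a dict; the fair-die distribution below is just an illustration:

```python
# Sketch: E(X) for a discrete distribution given as a dict {x: f(x)}.

def expected_value(pmf):
    """Return E(X) = sum over x of x*f(x) for a discrete pmf."""
    total = sum(pmf.values())
    assert abs(total - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(x * p for x, p in pmf.items())

# Fair six-sided die: E(X) = (1 + 2 + ... + 6)/6 = 3.5
die = {x: 1 / 6 for x in range(1, 7)}
print(expected_value(die))  # 3.5
```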
Mean of a Random Variable
Example 2
A lot containing 7 components is sampled by a quality inspector; the lot
contains 4 good components and 3 defective components. A sample of 3
is taken by the inspector. Find the expected value of the number of good
components in this sample.
The probability distribution of X is

x          0      1      2      3
P(X = x)   1/35   12/35  18/35  4/35

so E(X) = (0)(1/35) + (1)(12/35) + (2)(18/35) + (3)(4/35) = 12/7 ≈ 1.7.
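The probabilities above come from counting samples; a minimal sketch builds the same hypergeometric pmf with math.comb and confirms the mean:

```python
# Sketch: Example 2 via the hypergeometric pmf P(X = x) = C(4,x)C(3,3-x)/C(7,3).
from math import comb

pmf = {x: comb(4, x) * comb(3, 3 - x) / comb(7, 3) for x in range(4)}
# pmf -> {0: 1/35, 1: 12/35, 2: 18/35, 3: 4/35}

mu = sum(x * p for x, p in pmf.items())
print(mu)  # 12/7 ≈ 1.714
```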
Example 3
Let X be the random variable that denotes the life in hours of a certain electronic device. The probability density function is
\[ f(x) = \begin{cases} \dfrac{20000}{x^3}, & x > 100, \\ 0, & \text{elsewhere}. \end{cases} \]
Find the expected life of this type of device.
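Analytically, E(X) = ∫₁₀₀^∞ x · 20000/x³ dx = ∫₁₀₀^∞ 20000/x² dx = 20000/100 = 200 hours. A crude midpoint-rule check with a large finite upper limit (a numerical sketch, not part of the slides):

```python
# Sketch: numeric check that E(X) = ∫ x f(x) dx ≈ 200 hours.

def integrand(x):
    return x * 20000 / x**3   # x * f(x) = 20000 / x^2

a, b, n = 100.0, 1e6, 1_000_000        # truncate the infinite upper limit at 1e6
h = (b - a) / n
approx = sum(integrand(a + (i + 0.5) * h) for i in range(n)) * h
print(approx)  # close to 200
```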
Theorem 4
Let X be a random variable with probability distribution f(x). The expected value of the random variable g(X) is
\[ \mu_{g(X)} = E[g(X)] = \sum_{x} g(x) f(x) \]
if X is discrete, and
\[ \mu_{g(X)} = E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x)\,dx \]
if X is continuous.
Example 5
Suppose that the number of cars X that pass through a car wash between 4:00 P.M. and 5:00 P.M. on any sunny Friday has the following probability distribution:

x          4     5     6    7    8    9
P(X = x)   1/12  1/12  1/4  1/4  1/6  1/6
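A sketch using exact fractions to avoid rounding; the function g(x) = 2x − 1 used to illustrate Theorem 4 is an assumption, not stated in the table above:

```python
# Sketch: E(X) for the car-wash distribution, with exact arithmetic.
from fractions import Fraction as F

pmf = {4: F(1, 12), 5: F(1, 12), 6: F(1, 4), 7: F(1, 4), 8: F(1, 6), 9: F(1, 6)}
assert sum(pmf.values()) == 1

mu = sum(x * p for x, p in pmf.items())
print(mu)  # 41/6

# Theorem 4 with a hypothetical g(x) = 2x - 1 (illustrative assumption):
mu_g = sum((2 * x - 1) * p for x, p in pmf.items())
print(mu_g)  # 2*(41/6) - 1 = 38/3
```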
Definition 6
Let X and Y be random variables with joint probability distribution f(x, y). The mean, or expected value, of the random variable g(X, Y) is
\[ \mu_{g(X,Y)} = E[g(X,Y)] = \sum_{x} \sum_{y} g(x,y) f(x,y) \]
if X and Y are discrete, and
\[ \mu_{g(X,Y)} = E[g(X,Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x,y) f(x,y)\,dx\,dy \]
if X and Y are continuous.
Example 7
Find E(Y/X) for the density function
\[ f(x,y) = \begin{cases} \dfrac{x(1+3y^2)}{4}, & 0 < x < 2,\ 0 < y < 1, \\ 0, & \text{elsewhere}. \end{cases} \]
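Applying Definition 6 with g(X, Y) = Y/X gives E(Y/X) = ∫₀²∫₀¹ (y/x) f(x, y) dy dx = 5/8. A midpoint double Riemann sum confirms this numerically (a sketch, not the slides' worked solution):

```python
# Sketch: E(Y/X) by a midpoint double Riemann sum over 0 < x < 2, 0 < y < 1.

def f(x, y):
    return x * (1 + 3 * y**2) / 4

n = 400
dx, dy = 2 / n, 1 / n
total = 0.0
for i in range(n):
    x = (i + 0.5) * dx
    for j in range(n):
        y = (j + 0.5) * dy
        total += (y / x) * f(x, y) * dx * dy
print(total)  # ≈ 0.625 = 5/8
```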
Variance and Covariance of Random Variables
Example 9
Let the random variable X represent the number of defective parts for a
machine when 3 parts are sampled from a production line and tested.
The following is the probability distribution of X.
x      0     1     2     3
f(x)   0.51  0.38  0.10  0.01

Calculate σ².
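A direct computation from the definition σ² = E[(X − µ)²] (a sketch of the arithmetic, with µ = 0.61):

```python
# Sketch: σ² for the defective-parts distribution of Example 9.
pmf = {0: 0.51, 1: 0.38, 2: 0.10, 3: 0.01}

mu = sum(x * p for x, p in pmf.items())               # µ = 0.61
var = sum((x - mu) ** 2 * p for x, p in pmf.items())  # E[(X - µ)²]
print(mu, var)  # 0.61, 0.4979
```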
Theorem 10
The variance of a random variable X is
\[ \sigma^2 = E(X^2) - \mu^2. \]
For the continuous case the proof is step by step the same, with
summations replaced by integrations.
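The discrete-case proof is a short chain of equalities, expanding the square inside the expectation:

```latex
\sigma^2 = E[(X-\mu)^2] = \sum_x (x-\mu)^2 f(x)
         = \sum_x \bigl(x^2 - 2\mu x + \mu^2\bigr) f(x)
         = \sum_x x^2 f(x) - 2\mu \sum_x x f(x) + \mu^2 \sum_x f(x)
         = E(X^2) - 2\mu \cdot \mu + \mu^2 \cdot 1
         = E(X^2) - \mu^2 .
```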
Example 11
The weekly demand for a drinking-water product, in thousands of liters, from a local chain of efficiency stores is a continuous random variable X having the probability density
\[ f(x) = \begin{cases} 2(x-1), & 1 < x < 2, \\ 0, & \text{elsewhere}. \end{cases} \]
Find the mean and variance of X.
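Analytically µ = ∫₁² 2x(x − 1) dx = 5/3 and E(X²) = 17/6, so σ² = 17/6 − (5/3)² = 1/18. A midpoint-rule sketch verifying both:

```python
# Sketch: µ and σ² for f(x) = 2(x−1) on (1, 2), by midpoint sums.

def f(x):
    return 2 * (x - 1)

n = 100_000
h = 1 / n
xs = [1 + (i + 0.5) * h for i in range(n)]
mu = sum(x * f(x) for x in xs) * h          # ≈ 5/3
ex2 = sum(x**2 * f(x) for x in xs) * h      # ≈ 17/6
var = ex2 - mu**2                           # Theorem 10
print(mu, var)  # ≈ 1.6667, ≈ 0.05556
```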
Theorem 12
Let X be a random variable with probability distribution f(x). The variance of the random variable g(X) is
\[ \sigma^2_{g(X)} = E\{[g(X) - \mu_{g(X)}]^2\} = \sum_{x} [g(x) - \mu_{g(X)}]^2 f(x) \]
if X is discrete, and
\[ \sigma^2_{g(X)} = E\{[g(X) - \mu_{g(X)}]^2\} = \int_{-\infty}^{\infty} [g(x) - \mu_{g(X)}]^2 f(x)\,dx \]
if X is continuous.
Definition 13
Let X and Y be random variables with joint probability distribution f(x, y). The covariance of X and Y is
\[ \sigma_{XY} = E[(X - \mu_X)(Y - \mu_Y)] = \sum_{x} \sum_{y} (x - \mu_X)(y - \mu_Y) f(x, y) \]
if X and Y are discrete, and
\[ \sigma_{XY} = E[(X - \mu_X)(Y - \mu_Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x - \mu_X)(y - \mu_Y) f(x, y)\,dx\,dy \]
if X and Y are continuous.
Theorem 14
The covariance of two random variables X and Y with means µX and µY, respectively, is given by
\[ \sigma_{XY} = E(XY) - \mu_X \mu_Y. \]
Example 15
The fraction X of male runners and the fraction Y of female runners who compete in marathon races are described by the joint density function
\[ f(x,y) = \begin{cases} 8xy, & 0 \le y \le x \le 1, \\ 0, & \text{elsewhere}. \end{cases} \]
Find the covariance of X and Y.
Then
\[ \sigma_{XY} = E(XY) - \mu_X \mu_Y = \frac{4}{9} - \left(\frac{4}{5}\right)\left(\frac{8}{15}\right) = \frac{4}{225}. \]
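A numerical sketch checking the three moments E(XY) = 4/9, µ_X = 4/5, µ_Y = 8/15 by midpoint sums over the triangular region 0 ≤ y ≤ x ≤ 1 (the inner y-grid is scaled to [0, x] so the region is handled exactly):

```python
# Sketch: σ_XY for f(x, y) = 8xy on 0 ≤ y ≤ x ≤ 1, by midpoint double sums.
n = m = 400
hx = 1 / n
ex = ey = exy = 0.0
for i in range(n):
    x = (i + 0.5) * hx
    hy = x / m                      # inner grid spans [0, x]
    for j in range(m):
        y = (j + 0.5) * hy
        w = 8 * x * y * hx * hy     # f(x, y) dA
        exy += x * y * w
        ex += x * w
        ey += y * w
cov = exy - ex * ey
print(cov)  # ≈ 4/225 ≈ 0.01778
```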
Definition 16
Let X and Y be random variables with covariance σXY and standard deviations σX and σY, respectively. The correlation coefficient of X and Y is
\[ \rho_{XY} = \frac{\sigma_{XY}}{\sigma_X \sigma_Y}. \]
Example 17
Find the correlation coefficient of X and Y in Example 15.
Solution: Because
\[ E(X^2) = \int_0^1 4x^5\,dx = \frac{2}{3}, \qquad E(Y^2) = \int_0^1 4y^3(1 - y^2)\,dy = 1 - \frac{2}{3} = \frac{1}{3}, \]
we conclude that
\[ \sigma_X^2 = \frac{2}{3} - \left(\frac{4}{5}\right)^2 = \frac{2}{75}, \qquad \sigma_Y^2 = \frac{1}{3} - \left(\frac{8}{15}\right)^2 = \frac{11}{225}. \]
Hence,
\[ \rho_{XY} = \frac{4/225}{\sqrt{2/75}\,\sqrt{11/225}} = \frac{4}{\sqrt{66}}. \]
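The same midpoint double sum as in the covariance check, extended to second moments, reproduces ρ_XY = 4/√66 ≈ 0.4924 (a numerical sketch):

```python
# Sketch: ρ_XY for f(x, y) = 8xy on 0 ≤ y ≤ x ≤ 1.
from math import sqrt

n = m = 400
hx = 1 / n
ex = ey = ex2 = ey2 = exy = 0.0
for i in range(n):
    x = (i + 0.5) * hx
    hy = x / m                      # inner grid spans [0, x]
    for j in range(m):
        y = (j + 0.5) * hy
        w = 8 * x * y * hx * hy     # f(x, y) dA
        ex += x * w; ey += y * w
        ex2 += x * x * w; ey2 += y * y * w
        exy += x * y * w
rho = (exy - ex * ey) / sqrt((ex2 - ex**2) * (ey2 - ey**2))
print(rho)  # ≈ 0.4924
```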
Means and Variances of Linear Combinations of Random Variables
We now develop some useful properties that will simplify the calculations
of means and variances of random variables.
Theorem 18
If a and b are constants, then
\[ E(aX + b) = aE(X) + b. \]
Proof:
\[ E(aX + b) = \int_{-\infty}^{\infty} (ax + b) f(x)\,dx = a \int_{-\infty}^{\infty} x f(x)\,dx + b \int_{-\infty}^{\infty} f(x)\,dx = aE(X) + b. \]
Corollaries:
Setting a = 0, we have E(b) = b.
Setting b = 0, we have E(aX) = aE(X).
Theorem 19
The expected value of the sum or difference of two or more functions of a
random variable X is the sum or difference of the expected values of the
functions. That is,
E[g(X) ± h(X)] = E[g(X)] ± E[h(X)].
Theorem 20
The expected value of the sum or difference of two or more functions of
the random variables X and Y is the sum or difference of the expected
values of the functions. That is,
E[g(X, Y ) ± h(X, Y )] = E[g(X, Y )] ± E[h(X, Y )].
Corollaries:
Setting g(X, Y) = g(X) and h(X, Y) = h(Y), we have E[g(X) ± h(Y)] = E[g(X)] ± E[h(Y)].
Setting g(X, Y) = X and h(X, Y) = Y, we have E[X ± Y] = E[X] ± E[Y].
Theorem 21
Let X and Y be two independent random variables. Then
E(XY ) = E(X)E(Y ).
Proof:
\[ E(XY) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} xy\, f(x, y)\,dx\,dy. \]
Since X and Y are independent,
\[ f(x, y) = g(x)h(y), \]
where g(x) and h(y) are the marginal distributions of X and Y, respectively. Hence,
\[ E(XY) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} xy\, g(x)h(y)\,dx\,dy = \int_{-\infty}^{\infty} x g(x)\,dx \int_{-\infty}^{\infty} y h(y)\,dy = E(X)E(Y). \]
Corollary
Let X and Y be two independent random variables. Then σXY = 0.
Theorem 22
If X and Y are random variables with joint probability distribution f(x, y) and a, b, and c are constants, then
\[ \sigma^2_{aX+bY+c} = a^2\sigma^2_X + b^2\sigma^2_Y + 2ab\,\sigma_{XY}. \]
Proof: By definition,
\[ \sigma^2_{aX+bY+c} = E\{[(aX + bY + c) - \mu_{aX+bY+c}]^2\}. \]
Now, by Theorems 18 and 20, µ_{aX+bY+c} = aµ_X + bµ_Y + c. Therefore,
\[ \sigma^2_{aX+bY+c} = E\{[a(X - \mu_X) + b(Y - \mu_Y)]^2\} = a^2 E[(X - \mu_X)^2] + b^2 E[(Y - \mu_Y)^2] + 2ab\,E[(X - \mu_X)(Y - \mu_Y)] = a^2\sigma^2_X + b^2\sigma^2_Y + 2ab\,\sigma_{XY}. \]
Corollaries:
Setting b = 0, \[ \sigma^2_{aX+c} = a^2\sigma^2_X = a^2\sigma^2. \]
Setting a = 1 and b = 0, \[ \sigma^2_{X+c} = \sigma^2_X = \sigma^2. \]
The variance is unchanged if a constant is added to or subtracted from a random variable. The addition or subtraction of a constant simply shifts the values of X to the right or to the left but does not change their variability.
Setting b = c = 0, \[ \sigma^2_{aX} = a^2\sigma^2_X = a^2\sigma^2. \]
If a random variable is multiplied or divided by a constant, the variance is multiplied or divided by the square of the constant.
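These corollaries can be checked on any discrete distribution; the sketch below reuses the defective-parts pmf from Example 9, with illustrative constants a = 3 and c = 5:

```python
# Sketch: checking σ²_{aX+c} = a²σ²_X on a small discrete distribution.
pmf = {0: 0.51, 1: 0.38, 2: 0.10, 3: 0.01}

def var(pmf):
    mu = sum(x * p for x, p in pmf.items())
    return sum((x - mu) ** 2 * p for x, p in pmf.items())

a, c = 3, 5
shifted = {a * x + c: p for x, p in pmf.items()}   # distribution of aX + c
print(var(shifted), a**2 * var(pmf))  # equal: the shift c drops out
```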