
3-Mathematical Expectation

Teacher: Qiang Ma
maqiang809@scu.edu.cn

Contents

1 Mean of a Random Variable

2 Variance and Covariance of Random Variables

3 Means and Variances of Linear Combinations of Random Variables


Mean of a Random Variable

Definition 1
Let X be a random variable with probability distribution f(x). The
mean, or expected value, of X is

    µ = E(X) = Σ_x x f(x)

if X is discrete, and

    µ = E(X) = ∫_{−∞}^{∞} x f(x) dx

if X is continuous.

Example 2
A lot containing 7 components is sampled by a quality inspector; the lot
contains 4 good components and 3 defective components. A sample of 3
is taken by the inspector. Find the expected value of the number of good
components in this sample.

Solution: The total number of equally likely samples is C(7, 3) = 35.
Let X be the number of good components in the sample. For
x = 0, 1, 2, 3, the inspector takes x good components and 3 − x
defective components, which can be done in C(4, x) · C(3, 3 − x) ways.
Then the probability is

    P(X = x) = C(4, x) · C(3, 3 − x) / 35,

and the distribution table is

    x          0      1      2      3
    P(X = x)   1/35   12/35  18/35  4/35

Then the expected value of X is

    µ = E(X) = 0 · (1/35) + 1 · (12/35) + 2 · (18/35) + 3 · (4/35) = 12/7 ≈ 1.7.
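The calculation can be reproduced in a few lines of Python (a sketch using only the standard library; the pmf is the table above):

```python
from fractions import Fraction
from math import comb

# P(X = x): choose x of the 4 good and 3 - x of the 3 defective components
total = comb(7, 3)                       # 35 equally likely samples
pmf = {x: Fraction(comb(4, x) * comb(3, 3 - x), total) for x in range(4)}

mean = sum(x * p for x, p in pmf.items())
print(mean)          # 12/7
print(float(mean))   # ≈ 1.714
```

Working with `Fraction` keeps the probabilities exact, so the result is 12/7 rather than a rounded decimal.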

Example 3
Let X be the random variable that denotes the life in hours of a certain
electronic device. The probability density function is

    f(x) = 20000/x³ for x > 100,  and f(x) = 0 elsewhere.

Find the expected life of this type of device.

Solution: The expected life is

    µ = E(X) = ∫_{−∞}^{∞} x f(x) dx = ∫_{100}^{∞} x · (20000/x³) dx
             = ∫_{100}^{∞} 20000/x² dx = [−20000/x]_{100}^{∞} = 200.
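The improper integral can be sanity-checked numerically (a sketch; the midpoint rule, the cutoff B, and the helper name `expected_value` are choices of this check, not part of the example):

```python
def expected_value(pdf, a, b, n=200_000):
    # midpoint-rule approximation of the integral of x * pdf(x) over [a, b]
    h = (b - a) / n
    return sum((a + (i + 0.5) * h) * pdf(a + (i + 0.5) * h) for i in range(n)) * h

pdf = lambda x: 20000 / x**3 if x > 100 else 0.0

B = 1e4                    # finite cutoff for the improper integral
tail = 20000 / B           # exact value of the tail integral of 20000/x^2 from B to infinity
mu = expected_value(pdf, 100, B) + tail
print(round(mu, 2))        # 200.0
```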

Now let us consider a new random variable g(X), which depends on X;
that is, each value of g(X) is determined by the value of X. For
instance, g(X) might be X² or 3X − 1; whenever X assumes the value x,
g(X) assumes the value g(x).

Theorem 4
Let X be a random variable with probability distribution f(x). The
expected value of the random variable g(X) is

    µ_{g(X)} = E[g(X)] = Σ_x g(x) f(x)

if X is discrete, and

    µ_{g(X)} = E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx

if X is continuous.

Example 5
Suppose that the number of cars X that pass through a car wash between
4:00 P.M. and 5:00 P.M. on any sunny Friday has the following
probability distribution:

    x          4     5     6     7     8     9
    P(X = x)   1/12  1/12  1/4   1/4   1/6   1/6

Let g(X) = 2X − 1 represent the amount of money, in dollars, paid to
the attendant by the manager. Find the attendant's expected earnings
for this particular time period.

Solution: The attendant can expect to receive

    E[g(X)] = E(2X − 1) = Σ_{x=4}^{9} (2x − 1) f(x)
            = 7 · (1/12) + 9 · (1/12) + 11 · (1/4) + 13 · (1/4) + 15 · (1/6) + 17 · (1/6)
            ≈ 12.67.
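Theorem 4 applied in code (a sketch; the pmf is the table from the example):

```python
from fractions import Fraction

pmf = {4: Fraction(1, 12), 5: Fraction(1, 12), 6: Fraction(1, 4),
       7: Fraction(1, 4), 8: Fraction(1, 6), 9: Fraction(1, 6)}

g = lambda x: 2 * x - 1                      # attendant's pay when x cars arrive

# Theorem 4: E[g(X)] is the pmf-weighted sum of g(x)
earnings = sum(g(x) * p for x, p in pmf.items())
print(earnings, float(earnings))             # 38/3, about 12.67
```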

We shall now extend our concept of mathematical expectation to the
case of two random variables X and Y with joint probability
distribution f(x, y).

Definition 6
Let X and Y be random variables with joint probability distribution
f(x, y). The mean, or expected value, of the random variable g(X, Y) is

    µ_{g(X,Y)} = E[g(X, Y)] = Σ_x Σ_y g(x, y) f(x, y)

if X and Y are discrete, and

    µ_{g(X,Y)} = E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f(x, y) dx dy

if X and Y are continuous.



Example 7
Find E(Y/X) for the density function

    f(x, y) = x(1 + 3y²)/4 for 0 < x < 2, 0 < y < 1,  and f(x, y) = 0 elsewhere.
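The slides leave this example as an exercise; as a quick numeric check (a sketch, not part of the original), the midpoint-rule double integral below evaluates E(Y/X) by Definition 6 with g(x, y) = y/x, which agrees with the closed-form value 5/8:

```python
def double_integral(f, ax, bx, ay, by, n=400):
    # midpoint rule in both variables over [ax, bx] x [ay, by]
    hx, hy = (bx - ax) / n, (by - ay) / n
    total = 0.0
    for i in range(n):
        x = ax + (i + 0.5) * hx
        for j in range(n):
            y = ay + (j + 0.5) * hy
            total += f(x, y)
    return total * hx * hy

f = lambda x, y: x * (1 + 3 * y**2) / 4        # joint density on 0 < x < 2, 0 < y < 1
g = lambda x, y: (y / x) * f(x, y)             # integrand for E(Y/X); the x factor cancels

e = double_integral(g, 0, 2, 0, 1)
print(round(e, 4))                             # 0.625, i.e. 5/8
```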
Variance and Covariance of Random Variables

The mean, or expected value, of a random variable X is of special
importance in statistics because it describes where the probability
distribution is centered. However, the mean does not give an adequate
description of the shape of the distribution. We also need to
characterize the variability in the distribution.

[Figure 2.1: Distributions with equal means and unequal dispersions]

The quantity E[(X − µ)²] is referred to as the variance of the random
variable X, or the variance of the probability distribution of X, and
is denoted by Var(X), by the symbol σ²_X, or simply by σ² when it is
clear which random variable is meant.
Definition 8
Let X be a random variable with probability distribution f(x) and mean
µ. The variance of X is

    σ² = E[(X − µ)²] = Σ_x (x − µ)² f(x)    if X is discrete, and

    σ² = E[(X − µ)²] = ∫_{−∞}^{∞} (x − µ)² f(x) dx    if X is continuous.

The positive square root of the variance, σ, is called the standard
deviation of X.

Example 9
Let the random variable X represent the number of defective parts for a
machine when 3 parts are sampled from a production line and tested.
The following is the probability distribution of X.

    x      0     1     2     3
    f(x)   0.51  0.38  0.10  0.01

Calculate σ².
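The slides do not show the computation; applying Definition 8 directly to the table (a sketch, not part of the original):

```python
pmf = {0: 0.51, 1: 0.38, 2: 0.10, 3: 0.01}

mu = sum(x * p for x, p in pmf.items())               # E(X) = 0.61
var = sum((x - mu)**2 * p for x, p in pmf.items())    # Definition 8
print(round(mu, 2), round(var, 4))                    # 0.61 0.4979
```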

Theorem 10
The variance of a random variable X is
σ 2 = E(X 2 ) − µ2 .

Proof: For the discrete case, we can write

    σ² = Σ_x (x − µ)² f(x) = Σ_x (x² − 2µx + µ²) f(x)
       = Σ_x x² f(x) − 2µ Σ_x x f(x) + µ² Σ_x f(x).

Since µ = Σ_x x f(x) by definition, and Σ_x f(x) = 1 for any discrete
probability distribution, it follows that

    σ² = Σ_x x² f(x) − µ² = E(X²) − µ².

For the continuous case the proof is step by step the same, with
summations replaced by integrations.

Example 11
The weekly demand for a drinking-water product, in thousands of liters,
from a local chain of efficiency stores is a continuous random variable X
having the probability density

    f(x) = 2(x − 1) for 1 < x < 2,  and f(x) = 0 elsewhere.

Find the mean and variance of X.


Solution: Calculating E(X) and E(X²), we have

    µ = E(X) = 2 ∫_1^2 x(x − 1) dx = 5/3

and

    E(X²) = 2 ∫_1^2 x²(x − 1) dx = 17/6.

Therefore,

    σ² = 17/6 − (5/3)² = 1/18.
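A numeric check of both integrals (a sketch; the midpoint-rule helper is a choice of this check):

```python
def integral(f, a, b, n=100_000):
    # midpoint-rule approximation of the integral of f over [a, b]
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

pdf = lambda x: 2 * (x - 1) if 1 < x < 2 else 0.0

mu = integral(lambda x: x * pdf(x), 1, 2)               # 5/3
var = integral(lambda x: x**2 * pdf(x), 1, 2) - mu**2   # Theorem 10 shortcut
print(round(mu, 4), round(var, 4))                      # 1.6667 0.0556
```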

Theorem 12
Let X be a random variable with probability distribution f(x). The
variance of the random variable g(X) is

    σ²_{g(X)} = E{[g(X) − µ_{g(X)}]²} = Σ_x [g(x) − µ_{g(X)}]² f(x)

if X is discrete, and

    σ²_{g(X)} = E{[g(X) − µ_{g(X)}]²} = ∫_{−∞}^{∞} [g(x) − µ_{g(X)}]² f(x) dx

if X is continuous.

If g(X, Y) = (X − µ_X)(Y − µ_Y), where µ_X = E(X) and µ_Y = E(Y),
we have the covariance of X and Y, which we denote by σ_XY or
Cov(X, Y).

Definition 13
Let X and Y be random variables with joint probability distribution
f(x, y). The covariance of X and Y is

    σ_XY = E[(X − µ_X)(Y − µ_Y)] = Σ_x Σ_y (x − µ_X)(y − µ_Y) f(x, y)

if X and Y are discrete, and

    σ_XY = E[(X − µ_X)(Y − µ_Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x − µ_X)(y − µ_Y) f(x, y) dx dy

if X and Y are continuous.



The meanings of covariance

The covariance between two random variables is a measure of the nature
of the association between the two.

If large values of X often result in large values of Y, or small values
of X in small values of Y, then positive X − µ_X will often pair with
positive Y − µ_Y, and negative X − µ_X with negative Y − µ_Y. Thus the
product (X − µ_X)(Y − µ_Y) will tend to be positive.

On the other hand, if large X values often result in small Y values,
the product (X − µ_X)(Y − µ_Y) will tend to be negative.

The sign of the covariance indicates whether the relationship between
two dependent random variables is positive or negative.

Theorem 14
The covariance of two random variables X and Y with means µX and
µY , respectively, is given by
σXY = E(XY ) − µX µY .

Proof: For the discrete case, we can write

    σ_XY = Σ_x Σ_y (x − µ_X)(y − µ_Y) f(x, y)
         = Σ_x Σ_y xy f(x, y) − µ_X Σ_x Σ_y y f(x, y)
           − µ_Y Σ_x Σ_y x f(x, y) + µ_X µ_Y Σ_x Σ_y f(x, y).

Since µ_X = Σ_x Σ_y x f(x, y), µ_Y = Σ_x Σ_y y f(x, y), and
Σ_x Σ_y f(x, y) = 1 for any joint discrete distribution, it follows that

    σ_XY = E(XY) − µ_X µ_Y − µ_Y µ_X + µ_X µ_Y = E(XY) − µ_X µ_Y.



Example 15
The fraction X of male runners and the fraction Y of female runners who
compete in marathon races are described by the joint density function

    f(x, y) = 8xy for 0 ≤ y ≤ x ≤ 1,  and f(x, y) = 0 elsewhere.

Find the covariance of X and Y.

Solution: The marginal density functions g(x) and h(y) are

    g(x) = 4x³ for 0 ≤ x ≤ 1,  and g(x) = 0 elsewhere;
    h(y) = 4y(1 − y²) for 0 ≤ y ≤ 1,  and h(y) = 0 elsewhere.

Then we can compute

    µ_X = E(X) = ∫_0^1 4x⁴ dx = 4/5,   µ_Y = E(Y) = ∫_0^1 4y²(1 − y²) dy = 8/15.

From the joint density function given above, we have

    E(XY) = ∫_0^1 ∫_y^1 8x²y² dx dy = 4/9.

Then

    σ_XY = E(XY) − µ_X µ_Y = 4/9 − (4/5)(8/15) = 4/225.
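The integrals over the triangular region 0 ≤ y ≤ x ≤ 1 can be checked with nested midpoint rules (a sketch; the helper names are choices of this check):

```python
def integral(f, a, b, n=400):
    # midpoint-rule approximation of the integral of f over [a, b]
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

def over_region(g, n=400):
    # integral of g(x, y) over 0 <= y <= x <= 1: inner integral in y from 0 to x
    return integral(lambda x: integral(lambda y: g(x, y), 0.0, x, n), 0.0, 1.0, n)

pdf = lambda x, y: 8 * x * y                              # joint density on the region

mu_x = over_region(lambda x, y: x * pdf(x, y))            # 4/5
mu_y = over_region(lambda x, y: y * pdf(x, y))            # 8/15
e_xy = over_region(lambda x, y: x * y * pdf(x, y))        # 4/9
cov = e_xy - mu_x * mu_y
print(round(cov, 4))                                      # 0.0178, i.e. 4/225
```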

Although the covariance between two random variables does provide
information regarding the nature of the relationship, the magnitude of
σ_XY does not indicate anything regarding the strength of the
relationship, since σ_XY is not scale-free. Its magnitude will depend
on the units used to measure both X and Y. There is a scale-free
version of the covariance, called the correlation coefficient, that is
used widely in statistics.

Definition 16
Let X and Y be random variables with covariance σ_XY and standard
deviations σ_X and σ_Y, respectively. The correlation coefficient of
X and Y is

    ρ_XY = σ_XY / (σ_X σ_Y).

ρ_XY is free of the units of X and Y. The correlation coefficient
satisfies the inequality −1 ≤ ρ_XY ≤ 1.

Example 17
Find the correlation coefficient of X and Y in Example 15.

Solution: Because

    E(X²) = ∫_0^1 4x⁵ dx = 2/3,   E(Y²) = ∫_0^1 4y³(1 − y²) dy = 1 − 2/3 = 1/3,

we conclude that

    σ²_X = 2/3 − (4/5)² = 2/75,   σ²_Y = 1/3 − (8/15)² = 11/225.

Hence,

    ρ_XY = (4/225) / (√(2/75) √(11/225)) = 4/√66.
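Definition 16 applied numerically, reusing the marginals and the covariance 4/225 from Example 15 (a sketch; the means 4/5 and 8/15 are hard-coded from that example):

```python
from math import sqrt

def integral(f, a, b, n=400):
    # midpoint-rule approximation of the integral of f over [a, b]
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# marginal densities from Example 15, on [0, 1]
g = lambda x: 4 * x**3
h = lambda y: 4 * y * (1 - y**2)

var_x = integral(lambda x: x**2 * g(x), 0, 1) - (4 / 5) ** 2    # 2/75
var_y = integral(lambda y: y**2 * h(y), 0, 1) - (8 / 15) ** 2   # 11/225
rho = (4 / 225) / (sqrt(var_x) * sqrt(var_y))
print(round(rho, 4))                                            # 0.4924, i.e. 4/sqrt(66)
```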
Means and Variances of Linear Combinations of Random Variables

We now develop some useful properties that will simplify the calculations
of means and variances of random variables.

Theorem 18
If a and b are constants, then

    E(aX + b) = aE(X) + b.

Proof:

    E(aX + b) = ∫_{−∞}^{∞} (ax + b) f(x) dx
              = a ∫_{−∞}^{∞} x f(x) dx + b ∫_{−∞}^{∞} f(x) dx = aE(X) + b.

Corollaries:
    E(b) = b,
    E(aX) = aE(X).

Theorem 19
The expected value of the sum or difference of two or more functions of a
random variable X is the sum or difference of the expected values of the
functions. That is,
E[g(X) ± h(X)] = E[g(X)] ± E[h(X)].

Theorem 20
The expected value of the sum or difference of two or more functions of
the random variables X and Y is the sum or difference of the expected
values of the functions. That is,
E[g(X, Y ) ± h(X, Y )] = E[g(X, Y )] ± E[h(X, Y )].

Corollaries:
Taking g(X, Y) = g(X) and h(X, Y) = h(Y), we have
E[g(X) ± h(Y)] = E[g(X)] ± E[h(Y)].
Taking g(X, Y) = X and h(X, Y) = Y, we have E[X ± Y] = E[X] ± E[Y].
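Theorems 18 and 20 can be checked mechanically on a small discrete distribution (the joint pmf below is hypothetical, chosen only for illustration):

```python
from fractions import Fraction

# hypothetical joint pmf of (X, Y), used only to illustrate the theorems
joint = {(0, 1): Fraction(1, 4), (1, 1): Fraction(1, 4),
         (1, 2): Fraction(1, 4), (2, 2): Fraction(1, 4)}

E = lambda g: sum(g(x, y) * p for (x, y), p in joint.items())

# Theorem 18: E(aX + b) = aE(X) + b, with a = 3, b = 2
assert E(lambda x, y: 3 * x + 2) == 3 * E(lambda x, y: x) + 2
# Corollary of Theorem 20: E(X + Y) = E(X) + E(Y) (independence not required)
assert E(lambda x, y: x + y) == E(lambda x, y: x) + E(lambda x, y: y)
print("checks passed")
```

Note that the additivity of expectation holds for any joint distribution; independence is needed only for Theorem 21 below.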

Theorem 21
Let X and Y be two independent random variables. Then
E(XY ) = E(X)E(Y ).

Proof:

    E(XY) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f(x, y) dx dy.

Since X and Y are independent, we may write

    f(x, y) = g(x)h(y),

where g(x) and h(y) are the marginal distributions of X and Y,
respectively. Hence,

    E(XY) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy g(x)h(y) dx dy
          = ∫_{−∞}^{∞} x g(x) dx · ∫_{−∞}^{∞} y h(y) dy = E(X)E(Y).

Corollary
Let X and Y be two independent random variables. Then σXY = 0.

Theorem 22
If X and Y are random variables with joint probability distribution
f(x, y) and a, b, and c are constants, then

    σ²_{aX+bY+c} = a²σ²_X + b²σ²_Y + 2abσ_XY.

Proof: By definition, σ²_{aX+bY+c} = E{[(aX + bY + c) − µ_{aX+bY+c}]²}.
Now

    µ_{aX+bY+c} = E(aX + bY + c) = aE(X) + bE(Y) + c = aµ_X + bµ_Y + c.

Therefore,

    σ²_{aX+bY+c} = E{[a(X − µ_X) + b(Y − µ_Y)]²}
                 = a²E[(X − µ_X)²] + b²E[(Y − µ_Y)²] + 2abE[(X − µ_X)(Y − µ_Y)]
                 = a²σ²_X + b²σ²_Y + 2abσ_XY.

Corollaries:

    σ²_{aX+c} = a²σ²_X = a²σ².

    σ²_{X+c} = σ²_X = σ².

The variance is unchanged if a constant is added to or subtracted from a
random variable. The addition or subtraction of a constant simply shifts
the values of X to the right or to the left but does not change their
variability.

    σ²_{aX} = a²σ²_X = a²σ².

If a random variable is multiplied or divided by a constant, the variance
is multiplied or divided by the square of that constant.

If X and Y are independent random variables, then

    σ²_{aX±bY} = a²σ²_X + b²σ²_Y.

If X₁, X₂, …, Xₙ are independent random variables, then

    σ²_{a₁X₁+a₂X₂+···+aₙXₙ} = a₁²σ²_{X₁} + a₂²σ²_{X₂} + ··· + aₙ²σ²_{Xₙ}.
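The independence rule can be verified exactly on a small example (the two distributions below are hypothetical, chosen only for illustration; independence is imposed by taking the joint pmf as the product of the marginals):

```python
from itertools import product
from fractions import Fraction

# two hypothetical independent discrete variables (value: probability)
X = {0: Fraction(1, 2), 2: Fraction(1, 2)}   # Var(X) = 1
Y = {1: Fraction(1, 3), 4: Fraction(2, 3)}   # Var(Y) = 2

# independence: joint pmf is the product of the marginals
joint = {(x, y): px * py for (x, px), (y, py) in product(X.items(), Y.items())}

def var(g):
    mean = sum(g(x, y) * p for (x, y), p in joint.items())
    return sum((g(x, y) - mean) ** 2 * p for (x, y), p in joint.items())

a, b = 3, -2
lhs = var(lambda x, y: a * x + b * y)                           # Var(aX + bY)
rhs = a**2 * var(lambda x, y: x) + b**2 * var(lambda x, y: y)   # a^2 Var(X) + b^2 Var(Y)
print(lhs == rhs, lhs)   # True 17
```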
