Math322 Chapter4
METHODS
LECTURE SLIDES
CHAPTER 4
MATHEMATICAL EXPECTATION
4.1 MEAN OF A RANDOM VARIABLE
If two coins are tossed 16 times and X is the number of heads that occur per toss, then the values of X are
0, 1, and 2. Suppose that the experiment yields no heads, one head, and two heads a total of 4, 7, and 5
times, respectively. The average number of heads per toss of the two coins is then
[(0)(4) + (1)(7) + (2)(5)] / 16 = 17/16 ≈ 1.06.
Mathematical expectation, also known as the expected value, is the sum of all possible values of a random
variable, each weighted by its probability of occurrence.
We shall refer to this average value as the mean of the random variable X or the mean of the probability
distribution of X and write it as μX, or simply as μ when it is clear to which random variable we refer.
DEFINITION 4.1
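In its standard form, the definition reads: let X be a random variable with probability distribution f(x). The mean, or expected value, of X is
μ = E(X) = Σ_x x f(x)          if X is discrete,
μ = E(X) = ∫ x f(x) dx         (integral over all x) if X is continuous.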
EXAMPLE:
A coin is biased so that a head is three times as likely to occur as a tail. Find the expected
number of tails when this coin is tossed twice.
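A sketch of the computation, using P(H) = 3/4 and P(T) = 1/4 (which follow from the stated 3-to-1 ratio): let Y be the number of tails in the two tosses. Then
P(Y = 0) = (3/4)² = 9/16,   P(Y = 1) = 2(3/4)(1/4) = 6/16,   P(Y = 2) = (1/4)² = 1/16,
so
E(Y) = (0)(9/16) + (1)(6/16) + (2)(1/16) = 8/16 = 1/2 tail.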
EXAMPLE :
Assuming that a fair coin is tossed twice and letting X be the number of heads that occur, we find that the sample space for our experiment is
S = {HH,HT, TH, TT}.
Since the 4 sample points are all equally likely, it follows that
P(X = 0) = P(TT) = 1/4,   P(X = 1) = P(TH) + P(HT) = 1/2,
P(X = 2) = P(HH) = 1/4,
where a typical element, say TH, indicates that the first toss resulted in a tail followed by a head on the second
toss. Now, these probabilities are just the relative frequencies for the given events in the long run. Therefore,
μ = E(X) = (0)(1/4) + (1)(1/2) + (2)(1/4) = 1.
This result means that a person who makes this pair of tosses over and over again will, on the average, get
1 head per pair of tosses.
EXAMPLE 4.1:
A lot containing 7 components is sampled by a quality inspector; the lot contains 4
good components and 3 defective components. A sample of 3 is taken by the
inspector. Find the expected value of the number of good components in this sample.
Solution: Let X represent the number of good components in the sample. The probability distribution of X is hypergeometric:
f(x) = C(4, x) C(3, 3 − x) / C(7, 3),   x = 0, 1, 2, 3.
Simple calculations yield f(0) = 1/35, f(1) = 12/35, f(2) = 18/35, and f(3) = 4/35. Therefore,
μ = E(X) = (0)(1/35) + (1)(12/35) + (2)(18/35) + (3)(4/35) = 12/7 ≈ 1.7.
Thus, if a sample of size 3 is selected at random over and over again from a lot of 4 good
components and 3 defective components, it will contain, on average, 1.7 good components.
EXAMPLE 4.2
Let X be the random variable that denotes the life in hours of a certain electronic device.
The probability density function is
Therefore, we can expect this type of device to last, on average, 200 hours.
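A worked version of this expectation, assuming a density of the form f(x) = 20,000/x³ for x > 100 and f(x) = 0 elsewhere (this specific form is an assumption, chosen because it is consistent with the 200-hour figure above):
E(X) = ∫_{100}^{∞} x (20,000/x³) dx = ∫_{100}^{∞} 20,000/x² dx = [−20,000/x]_{100}^{∞} = 200 hours.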
THEOREM 4.1
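Stated in the usual way: let X be a random variable with probability distribution f(x). The expected value of the random variable g(X) is
μ_g(X) = E[g(X)] = Σ_x g(x) f(x)          if X is discrete,
μ_g(X) = E[g(X)] = ∫ g(x) f(x) dx         if X is continuous.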
EXAMPLE 4.3
Suppose that the number of cars X that pass through a car wash between 4:00 P.M. and 5:00 P.M.
on any sunny Friday has the following probability distribution:
Let g(X) = 2X−1 represent the amount of money, in dollars, paid to the attendant by the manager.
Find the attendant’s expected earnings for this particular time period.
Solution : By Theorem 4.1, the attendant can expect to receive
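In symbols, with f(x) denoting the tabulated distribution, the required quantity is
E[g(X)] = E(2X − 1) = Σ_x (2x − 1) f(x),
which, by linearity of expectation, is the same as 2E(X) − 1.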
EXAMPLE 4.4
Solution : We have
Note that if g(X, Y) = X in Definition 4.2 (which defines E[g(X, Y)] as Σ_x Σ_y g(x, y) f(x, y) in the discrete case, with the corresponding double integral in the continuous case), we have
E(X) = Σ_x Σ_y x f(x, y) = Σ_x x g(x)          (discrete case),
E(X) = ∫∫ x f(x, y) dy dx = ∫ x g(x) dx        (continuous case),
where g(x) is the marginal distribution of X. Therefore, in calculating E(X) over a two-
dimensional space, one may use either the joint probability distribution of X and Y or
the marginal distribution of X. Similarly, we define
E(Y) = Σ_y Σ_x y f(x, y) = Σ_y y h(y)   or   E(Y) = ∫∫ y f(x, y) dx dy = ∫ y h(y) dy,
where h(y) is the marginal distribution of Y.
The quantity x − μ in Definition 4.3 is called the deviation of an observation from its mean. Since the deviations are
squared and then averaged, σ² will be much smaller for a set of x values that are close to μ than it will be for a set of
values that vary considerably from μ.
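For reference, the standard statement of Definition 4.3: let X be a random variable with probability distribution f(x) and mean μ. The variance of X is
σ² = E[(X − μ)²] = Σ_x (x − μ)² f(x)          if X is discrete,
σ² = E[(X − μ)²] = ∫ (x − μ)² f(x) dx         if X is continuous.
The positive square root of the variance, σ, is called the standard deviation of X.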
THEOREM 4.2
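The usual statement of this computing formula: the variance of a random variable X is
σ² = E(X²) − μ².
It follows by expanding E[(X − μ)²] = E(X²) − 2μE(X) + μ² = E(X²) − μ².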
THEOREM 4.3
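In the form used for the variance-of-g(X) example below: let X be a random variable with probability distribution f(x). The variance of the random variable g(X) is
σ²_g(X) = E{[g(X) − μ_g(X)]²} = Σ_x [g(x) − μ_g(X)]² f(x)          if X is discrete,
σ²_g(X) = E{[g(X) − μ_g(X)]²} = ∫ [g(x) − μ_g(X)]² f(x) dx        if X is continuous.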
EXAMPLE 4.9:
Let the random variable X represent the number of defective parts for a
machine when 3 parts are sampled from a production line and tested. The
following is the probability distribution of X:

x       0      1      2      3
f(x)    0.51   0.38   0.10   0.01

Using Theorem 4.2, calculate σ².
First, we compute
μ = (0)(0.51) + (1)(0.38) + (2)(0.10) + (3)(0.01) = 0.61.
Now,
E(X²) = (0)²(0.51) + (1)²(0.38) + (2)²(0.10) + (3)²(0.01) = 0.87,
so that
σ² = E(X²) − μ² = 0.87 − (0.61)² = 0.4979.
Calculate the variance of g(X) = 2X + 3, where X is a random variable with probability distribution
Solution : First, we find the mean of the random variable 2X +3. According to Theorem 4.1
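In outline (a sketch only, since the numerical values depend on the tabulated f(x)):
μ_2X+3 = E(2X + 3) = Σ_x (2x + 3) f(x),
and then, by Theorem 4.3,
σ²_2X+3 = E{[(2X + 3) − μ_2X+3]²} = Σ_x (2x + 3 − μ_2X+3)² f(x).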
Note that the covariance only describes the linear relationship between two random variables.
Therefore, if the covariance between X and Y is zero, X and Y may still be related in a nonlinear way;
in particular, zero covariance does not imply that they are independent.
THEOREM 4.4
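With σXY = Cov(X, Y) = E[(X − μX)(Y − μY)] denoting the covariance of X and Y, the theorem in its usual form says
σXY = E(XY) − μX μY.
This follows from expanding the product: E[(X − μX)(Y − μY)] = E(XY) − μX E(Y) − μY E(X) + μX μY = E(XY) − μX μY.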
THEOREM 4.5
If X and Y are independent, then Cov(X, Y) = 0. The converse is not always true.
EXAMPLE 4.14:
The fraction X of male runners and the fraction Y of female runners who compete in marathon races are
described by the joint density function
It should be clear to the reader that ρXY = σXY / (σX σY) is free of the units of X and Y. The correlation coefficient satisfies the
inequality −1 ≤ ρXY ≤ 1. It assumes a value of zero when σXY = 0. When there is an exact linear dependency, say
Y ≡ a + bX, then ρXY = 1 if b > 0 and ρXY = −1 if b < 0.
EXAMPLE :
Suppose that X and Y are independent random variables having the joint probability
distribution
By the above theorem, Cov(X, Y) = E(XY) − E(X)E(Y). To compute the covariance, we need to find E(XY), E(X),
and E(Y).
SOLUTION:
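One observation before any table is used: since X and Y are independent, E(XY) = E(X)E(Y) for any such joint distribution, so the computation must give
Cov(X, Y) = E(XY) − E(X)E(Y) = 0,
in agreement with Theorem 4.5.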
EXAMPLE :
Find the covariance of the random variables X and Y having the joint
probability density
Solution:
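A sketch of the general recipe for the continuous case, with f(x, y) the given joint density and g(x), h(y) its marginal densities:
σXY = E(XY) − μX μY,   where   E(XY) = ∫∫ x y f(x, y) dy dx,   μX = ∫ x g(x) dx,   μY = ∫ y h(y) dy.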
4.3 MEANS AND VARIANCES OF LINEAR COMBINATIONS OF RANDOM VARIABLES
EXAMPLE:
The weekly demand for a certain drink, in thousands of liters, at a chain of convenience
stores is a continuous random variable g(X) = X² + X − 2, where X has the density
function
Find the expected value of the weekly demand for the drink.
Solution:
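By Theorem 4.1, whatever the density f(x) is,
E[g(X)] = E(X² + X − 2) = ∫ (x² + x − 2) f(x) dx,
with the integral taken over the support of f. (Purely as an illustration, with a hypothetical density f(x) = 2(x − 1) for 1 < x < 2 and 0 elsewhere, this integral works out to 5/2, i.e., an expected demand of 2,500 liters.)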
EXAMPLE :
If X and Y are random variables with variances σ²X = 2 and σ²Y = 4 and covariance σXY = −2,
find the variance of the random variable Z = 3X − 4Y + 8.
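Using the standard variance formula for a linear combination, σ²_aX+bY+c = a²σ²X + b²σ²Y + 2ab σXY, we get
σ²_Z = Var(3X − 4Y + 8) = (3)²(2) + (−4)²(4) + 2(3)(−4)(−2) = 18 + 64 + 48 = 130.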
EXAMPLE:
Let X and Y denote the amounts of two different types of impurities in a batch of a
certain chemical product. Suppose that X and Y are independent random variables with
variances σ²X = 5 and σ²Y = 3. Find the variance of the random variable Z = −2X + 4Y − 3.
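Since X and Y are independent, σXY = 0, and the same linear-combination formula gives
σ²_Z = Var(−2X + 4Y − 3) = (−2)²(5) + (4)²(3) = 20 + 48 = 68.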