
MATH 322 PROBABILITY AND STATISTICAL METHODS

LECTURE SLIDES
CHAPTER 4

MATHEMATICAL EXPECTATION

4.1 Mean of a Random Variable
4.2 Variance and Covariance of Random Variables
4.3 Means and Variances of Linear Combinations of Random Variables
Exercises
4.1 MEAN OF A RANDOM VARIABLE
If two coins are tossed 16 times and X is the number of heads that occur per toss, then the values of X are
0, 1, and 2. Suppose that the experiment yields no heads, one head, and two heads a total of 4, 7, and 5
times, respectively. The average number of heads per toss of the two coins is then

[(0)(4) + (1)(7) + (2)(5)] / 16 = 17/16 ≈ 1.06.

Mathematical expectation, also known as the expected value, is the average of the possible values of a
random variable, each weighted by its probability.

We shall refer to this average value as the mean of the random variable X, or the mean of the probability
distribution of X, and write it as μ_X or simply as μ when it is clear to which random variable we refer.
DEFINITION 4.1
Let X be a random variable with probability distribution f(x). The mean, or expected value, of X is
μ = E(X) = Σ_x x f(x) if X is discrete, and
μ = E(X) = ∫ x f(x) dx (over all x) if X is continuous.
EXAMPLE:
A coin is biased so that a head is three times as likely to occur as a tail. Find the expected
number of tails when this coin is tossed twice.
EXAMPLE :

Assuming that 1 fair coin was tossed twice, we find that the sample space for our experiment is
S = {HH,HT, TH, TT}.
Since the 4 sample points are all equally likely, it follows that
P(X = 0) = P(TT) =1/4, P(X = 1) = P(TH) + P(HT) =1/2
P(X = 2) = P(HH) =1/4
 where a typical element, say TH, indicates that the first toss resulted in a tail followed by a head on the second
toss. Now, these probabilities are just the relative frequencies for the given events in the long run. Therefore,

μ = E(X) = (0)(1/4) + (1)(1/2) + (2)(1/4) = 1.

This result means that a person who tosses 2 coins over and over again will, on average, get 1
head per toss.
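This long-run average can be checked with a short Python sketch using the probabilities derived above:

```python
# E(X) for X = number of heads in two tosses of a fair coin.
# P(X = x) comes from the sample space S = {HH, HT, TH, TT}.
pmf = {0: 0.25, 1: 0.5, 2: 0.25}

mu = sum(x * p for x, p in pmf.items())
print(mu)  # 1.0
```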
EXAMPLE 4.1:
A lot containing 7 components is sampled by a quality inspector; the lot contains 4
good components and 3 defective components. A sample of 3 is taken by the
inspector. Find the expected value of the number of good components in this sample.
Solution: Let X represent the number of good components in the sample. The
probability distribution of X is

f(x) = C(4, x) C(3, 3 − x) / C(7, 3),  x = 0, 1, 2, 3.

Simple calculations yield f(0) = 1/35, f(1) = 12/35, f(2) = 18/35, and f(3) = 4/35.
Therefore,

μ = E(X) = (0)(1/35) + (1)(12/35) + (2)(18/35) + (3)(4/35) = 12/7 ≈ 1.7.

Thus, if a sample of size 3 is selected at random over and over again from a lot of 4 good
components and 3 defective components, it will contain, on average, 1.7 good components.
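A minimal Python check of this calculation, building the hypergeometric probabilities with math.comb:

```python
from math import comb

# Hypergeometric pmf for Example 4.1: sample of 3 from 4 good + 3 defective.
def f(x):
    return comb(4, x) * comb(3, 3 - x) / comb(7, 3)

pmf = {x: f(x) for x in range(4)}  # f(0)=1/35, f(1)=12/35, f(2)=18/35, f(3)=4/35
mu = sum(x * p for x, p in pmf.items())
print(round(mu, 4))  # 1.7143  (i.e., 12/7)
```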
EXAMPLE 4.2

Let X be the random variable that denotes the life in hours of a certain electronic device.
The probability density function is

f(x) = 20,000/x³ for x > 100, and f(x) = 0 elsewhere.

Find the expected life of this type of device.

Solution: Using Definition 4.1, we have

μ = E(X) = ∫ from 100 to ∞ of x (20,000/x³) dx = ∫ from 100 to ∞ of 20,000/x² dx = 200.

Therefore, we can expect this type of device to last, on average, 200 hours.
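As a numerical sanity check, the integral can be approximated in Python (assuming, as in the standard version of this example, the density f(x) = 20000/x³ for x > 100):

```python
# Numerical check of E(X) = ∫ x·f(x) dx for the device-life density,
# assuming f(x) = 20000/x**3 for x > 100 (0 elsewhere).
def f(x):
    return 20000 / x**3

# Midpoint-rule integration of x*f(x) = 20000/x**2 out to a large cutoff b;
# the exact tail beyond b contributes 20000/b.
a, b, n = 100.0, 100000.0, 100_000
h = (b - a) / n
total = 0.0
for i in range(n):
    x = a + (i + 0.5) * h
    total += x * f(x)
approx = total * h + 20000 / b   # add the tail correction

print(round(approx, 2))  # 200.0
```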
THEOREM 4.1
Let X be a random variable with probability distribution f(x). The expected value of the random
variable g(X) is
μ_g(X) = E[g(X)] = Σ_x g(x) f(x) if X is discrete, and
μ_g(X) = E[g(X)] = ∫ g(x) f(x) dx (over all x) if X is continuous.
EXAMPLE 4.3

Suppose that the number of cars X that pass through a car wash between 4:00 P.M. and 5:00 P.M.
on any sunny Friday has the following probability distribution:

x:        4     5     6     7     8     9
P(X = x): 1/12  1/12  1/4   1/4   1/6   1/6

Let g(X) = 2X − 1 represent the amount of money, in dollars, paid to the attendant by the manager.
Find the attendant's expected earnings for this particular time period.
Solution: By Theorem 4.1, the attendant can expect to receive

E[g(X)] = E(2X − 1) = Σ (2x − 1) f(x) = $12.67.
EXAMPLE 4.4

Let X be a random variable with density function

f(x) = x²/3 for −1 < x < 2, and f(x) = 0 elsewhere.

Find the expected value of g(X) = 4X + 3.

Solution: By Theorem 4.1, we have

E(4X + 3) = ∫ from −1 to 2 of (4x + 3)(x²/3) dx = (1/3) ∫ from −1 to 2 of (4x³ + 3x²) dx = 8.
DEFINITION 4.2:
Let X and Y be random variables with joint probability distribution f(x, y). The expected value of the
random variable g(X, Y) is
E[g(X, Y)] = Σ_x Σ_y g(x, y) f(x, y) if X and Y are discrete, and
E[g(X, Y)] = ∫∫ g(x, y) f(x, y) dx dy if X and Y are continuous.
EXAMPLE 4.5

Let X and Y be the random variables with joint


probability distribution indicated in Table 3.1 on page
96. Find the expected value of g(X, Y ) = XY . The
table is reprinted here for convenience.

Solution: By Definition 4.2, we write

E(XY) = Σ_x Σ_y xy f(x, y).

Only the (x, y) pairs with both x and y nonzero contribute to the sum.

EXAMPLE 4.6:
Find E(Y/X) for the density function

Solution : We have
 Note that if g(X, Y) = X in Definition 4.2, we have

E(X) = Σ_x Σ_y x f(x, y) = Σ_x x g(x) (discrete case),

 where g(x) is the marginal distribution of X. Therefore, in calculating E(X) over a two-
dimensional space, one may use either the joint probability distribution of X and Y or
the marginal distribution of X. Similarly, we define

E(Y) = Σ_y Σ_x y f(x, y) = Σ_y y h(y) (discrete case),

 where h(y) is the marginal distribution of the random variable Y.


4.2 VARIANCE AND COVARIANCE OF RANDOM VARIABLES
The mean, or expected value, of a random variable X is of special importance in
statistics because it describes where the probability distribution is centered.
By itself, however, the mean does not give an adequate description of the shape of the
distribution.
We also need to characterize the variability in the distribution. In Figure 4.1, we have the
histograms of two discrete probability distributions that have the same mean, μ = 2, but
differ considerably in variability, or the dispersion of their observations about the mean.

Figure 4.1: Distributions with equal means and unequal dispersions.


DEFINITION 4.3
Let X be a random variable with probability distribution f(x) and mean μ. The variance of X is
σ² = E[(X − μ)²].

The quantity x − μ in Definition 4.3 is called the deviation of an observation from its mean. Since the deviations are
squared and then averaged, σ² will be much smaller for a set of x values that are close to μ than it will be for a set of
values that vary considerably from μ.
THEOREM 4.2
The variance of a random variable X is σ² = E(X²) − μ².
EXAMPLE 4.9:

Let the random variable X represent the number of defective parts for a
machine when 3 parts are sampled from a production line and tested. The
following is the probability distribution of X.

Using Theorem 4.2, calculate Var(X).


SOLUTION:

First, we compute
μ = (0)(0.51) + (1)(0.38) + (2)(0.10) + (3)(0.01) = 0.61.
Now,

E(X²) = (0)(0.51) + (1)(0.38) + (4)(0.10) + (9)(0.01) = 0.87.

Therefore,

σ² = 0.87 − (0.61)² = 0.4979.
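The shortcut formula σ² = E(X²) − μ² from Theorem 4.2 is easy to verify in Python with the probabilities of Example 4.9:

```python
# Variance via the shortcut σ² = E(X²) − μ² for Example 4.9.
pmf = {0: 0.51, 1: 0.38, 2: 0.10, 3: 0.01}

mu  = sum(x * p for x, p in pmf.items())      # 0.61
ex2 = sum(x**2 * p for x, p in pmf.items())   # 0.87
print(round(ex2 - mu**2, 4))  # 0.4979
```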


EXAMPLE:
The weekly demand for a drinking-water product, in thousands of liters, from a local
chain of efficiency stores is a continuous random variable X having the probability
density

Find the mean and variance of X.


Solution:
Calculating E(X) and E(X²), we have
THEOREM 4.3
Let X be a random variable with probability distribution f(x). The variance of the random variable
g(X) is σ²_g(X) = E{[g(X) − μ_g(X)]²}.
EXAMPLE:

Calculate the variance of g(X) = 2X + 3, where X is a random variable with probability distribution

Solution : First, we find the mean of the random variable 2X +3. According to Theorem 4.1

Now, using Theorem 4.3, we have


DEFINITION 4.4
Let X and Y be random variables with joint probability distribution f(x, y). The covariance of X and Y is
σ_XY = E[(X − μ_X)(Y − μ_Y)].

 The covariance between two random variables is a measure of the nature of the association between
the two.
 If large values of X often result in large values of Y, or small values of X result in small values of Y,
positive X − μ_X will often result in positive Y − μ_Y, and negative X − μ_X will often result in negative
Y − μ_Y. Thus, the product (X − μ_X)(Y − μ_Y) will tend to be positive. On the other hand, if large X
values often result in small Y values, the product (X − μ_X)(Y − μ_Y) will tend to be negative.
 The sign of the covariance indicates whether the relationship between two dependent random variables
is positive or negative.
 When X and Y are statistically independent, it can be shown that the covariance is zero (see Theorem
4.5). The converse, however, is not generally true.
 Two variables may have zero covariance and still not be statistically independent.

 Note that the covariance describes only the linear relationship between two random variables.
Therefore, if the covariance between X and Y is zero, X and Y may still have a nonlinear relationship,
which means that they are not necessarily independent.
THEOREM 4.4
The covariance of two random variables X and Y with means μ_X and μ_Y, respectively, is
σ_XY = E(XY) − μ_X μ_Y.

THEOREM 4.5: If X and Y are independent, then Cov(X, Y) = 0. The converse is not always
true.
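The shortcut σ_XY = E(XY) − μ_X μ_Y is easy to apply in Python; the small joint table below is made up for illustration and is not taken from the text:

```python
# Covariance via σ_XY = E(XY) − E(X)E(Y) on a hypothetical joint pmf.
joint = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.6}

ex  = sum(x * p for (x, y), p in joint.items())
ey  = sum(y * p for (x, y), p in joint.items())
exy = sum(x * y * p for (x, y), p in joint.items())

print(round(exy - ex * ey, 4))  # 0.11 (positive: large X tends to go with large Y)
```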
EXAMPLE 4.14:

The fraction X of male runners and the fraction Y of female runners who compete in marathon races are
described by the joint density function

Find the covariance of X and Y .


Solution :
We first compute the marginal density functions. They are
DEFINITION 4.5
Let X and Y be random variables with covariance σ_XY and standard deviations σ_X and σ_Y,
respectively. The correlation coefficient of X and Y is ρ_XY = σ_XY / (σ_X σ_Y).

It should be clear to the reader that ρ_XY is free of the units of X and Y. The correlation coefficient satisfies the
inequality −1 ≤ ρ_XY ≤ 1. It assumes a value of zero when σ_XY = 0. Where there is an exact linear dependency, say
Y ≡ a + bX, ρ_XY = 1 if b > 0 and ρ_XY = −1 if b < 0.
EXAMPLE :
Suppose that X and Y are independent random variables having the joint probability
distribution

Show that Cov(X, Y) is zero.

By Theorem 4.4, Cov(X, Y) = E(XY) − E(X)E(Y). To compute the covariance, we need to find E(XY), E(X),
and E(Y).
SOLUTION:
EXAMPLE :

Find the covariance of the random variables X and Y having the joint
probability density

Show that E(XY) = E(X)E(Y).

Solution:
4.3 MEANS AND VARIANCES OF LINEAR COMBINATIONS OF RANDOM
VARIABLES
EXAMPLE:

Let X be a random variable with probability distribution as follows:

Find the expected value of Y = (X − 1)².


Solution:
EXAMPLE:

The weekly demand for a certain drink, in thousands of liters, at a chain of convenience
stores is g(X) = X² + X − 2, where X is a continuous random variable with density
function

Find the expected value of the weekly demand for the drink.
Solution:
EXAMPLE :

If X and Y are random variables with variances σ_X² = 2 and σ_Y² = 4 and covariance σ_XY = −2,
find the variance of the random variable Z = 3X − 4Y + 8.

Solution: Var(Z) = Var(3X − 4Y + 8) = 9 Var(X) + 16 Var(Y) + 2(3)(−4) Cov(X, Y)
= 9(2) + 16(4) − 24(−2) = 18 + 64 + 48 = 130.
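A small Python helper makes the plug-in formula Var(aX + bY + c) = a²σ_X² + b²σ_Y² + 2ab σ_XY easy to apply (the constant c drops out):

```python
# Variance of a linear combination aX + bY + c.
def var_linear(a, b, var_x, var_y, cov_xy):
    # The additive constant c does not affect the variance.
    return a**2 * var_x + b**2 * var_y + 2 * a * b * cov_xy

# This example: a = 3, b = −4, σ_X² = 2, σ_Y² = 4, σ_XY = −2.
print(var_linear(3, -4, 2, 4, -2))  # 130
```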
EXAMPLE:
Let X and Y denote the amounts of two different types of impurities in a batch of a
certain chemical product. Suppose that X and Y are independent random variables with
variances σ_X² = 5 and σ_Y² = 3. Find the variance of the random variable Z = −2X + 4Y − 3.

Solution: Since X and Y are independent, σ_XY = 0, so
Var(Z) = (−2)² Var(X) + (4)² Var(Y) = 4(5) + 16(3) = 68.
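As a sketch, the answer Var(Z) = 68 can also be checked by Monte Carlo simulation; normal distributions are an arbitrary choice here, since any independent X and Y with the stated variances give the same result:

```python
import random

# Monte Carlo check of Var(−2X + 4Y − 3) when X ⊥ Y,
# Var(X) = 5, Var(Y) = 3. Expected: 4·5 + 16·3 = 68.
random.seed(0)
n = 200_000
z = [-2 * random.gauss(0, 5**0.5) + 4 * random.gauss(0, 3**0.5) - 3
     for _ in range(n)]

mean = sum(z) / n
var = sum((v - mean)**2 for v in z) / (n - 1)  # sample variance
print(round(var, 1))  # close to 68
```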
