Joint Density
Dhrubajyoti Mandal
Dept. of Mathematics
BITS Pilani K K Birla Goa Campus, Goa
Functions of More than One Random Variable
Example
Toss two coins. Let X be the outcome of the first coin and Y be
the outcome of the second coin. Find the joint probability mass
function in each of the following cases.
              Y
            0      1      2
   X   0   0.60   0.10   0.04
       1   0.15   0.08   0.03
What if you take the row totals and the column totals in the table?
Marginal Probability Mass Function
Definition
Let (X, Y) be two dimensional discrete r. v. with joint density
fXY (x , y ). The marginal probability mass function for X denoted
by fX is given by
f_X(x_i) = \sum_{\text{all } y} f_{XY}(x_i, y)
Remark: The row totals of the joint PMF table give the marginal PMF of X, and the column totals give the marginal PMF of Y. From these marginal PMFs one can find the probability of any event involving only X or only Y.
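As a quick sanity check on the coin-toss table above, here is a minimal Python sketch (assuming NumPy is available) that computes the row and column totals:

    import numpy as np

    # Joint PMF from the table above: rows are x = 0, 1; columns are y = 0, 1, 2.
    joint = np.array([[0.60, 0.10, 0.04],
                      [0.15, 0.08, 0.03]])

    f_X = joint.sum(axis=1)  # row totals -> marginal PMF of X: [0.74, 0.26]
    f_Y = joint.sum(axis=0)  # column totals -> marginal PMF of Y: [0.75, 0.18, 0.07]

    print(f_X, f_Y, joint.sum())  # grand total is 1 (up to floating-point rounding)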
Examples
Example
Let X and Y be discrete random variables with joint probability
density function
f_{XY}(x, y) =
    \begin{cases}
      \frac{1}{21}(x + y), & \text{if } x = 1, 2;\ y = 1, 2, 3 \\
      0, & \text{otherwise.}
    \end{cases}
What are the marginal probability density functions of X and Y?
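A minimal sketch of the computation in Python (using the standard-library fractions module to keep exact values):

    from fractions import Fraction

    # Joint PMF f(x, y) = (x + y)/21 on x = 1, 2 and y = 1, 2, 3.
    f = lambda x, y: Fraction(x + y, 21)

    f_X = {x: sum(f(x, y) for y in (1, 2, 3)) for x in (1, 2)}
    f_Y = {y: sum(f(x, y) for x in (1, 2)) for y in (1, 2, 3)}

    print(f_X)  # {1: Fraction(3, 7), 2: Fraction(4, 7)}
    print(f_Y)  # {1: Fraction(5, 21), 2: Fraction(1, 3), 3: Fraction(3, 7)}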
Example
For what value of the constant k is the function

f_{XY}(x, y) =
    \begin{cases}
      kxy, & \text{if } x = 1, 2, 3;\ y = 1, 2, 3 \\
      0, & \text{otherwise}
    \end{cases}

a joint probability density function of some random variables X and Y?
ANS: 1/36.
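A minimal Python sketch of the normalization argument (the probabilities kxy must sum to 1):

    from fractions import Fraction

    # Sum of x*y over x, y in {1, 2, 3} is 6 * 6 = 36, so k = 1/36.
    total = sum(x * y for x in (1, 2, 3) for y in (1, 2, 3))  # 36
    k = Fraction(1, total)
    print(k)  # 1/36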
Joint Cumulative Distribution Function

F(x, y) = P(X \le x \text{ and } Y \le y)
Example
A privately owned business operates both a drive-in facility and a walk-in facility. On a randomly selected day, let X and Y,
respectively, be the proportions of the time that the drive-in and
the walk-in facilities are in use, and suppose that the joint density
function of these random variables is
f_{XY}(x, y) =
    \begin{cases}
      \frac{2}{5}(2x + 3y), & 0 \le x \le 1,\ 0 \le y \le 1 \\
      0, & \text{elsewhere.}
    \end{cases}
- Verify that \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{XY}(x, y)\, dy\, dx = 1.
- Find P[(X, Y) \in A], where A = \{(x, y) \mid 0 < x < \frac{1}{2},\ \frac{1}{4} < y < \frac{1}{2}\}.
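Both integrals can be checked symbolically; a minimal sketch, assuming SymPy is available:

    import sympy as sp

    x, y = sp.symbols('x y')
    f = sp.Rational(2, 5) * (2*x + 3*y)

    # Total probability over the unit square:
    print(sp.integrate(f, (y, 0, 1), (x, 0, 1)))  # 1

    # P(0 < X < 1/2, 1/4 < Y < 1/2):
    print(sp.integrate(f, (y, sp.Rational(1, 4), sp.Rational(1, 2)),
                          (x, 0, sp.Rational(1, 2))))  # 13/160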
Examples
Example
Let the joint density function of X and Y be given by

f_{XY}(x, y) =
    \begin{cases}
      kxy^2, & \text{if } 0 < x < y < 1 \\
      0, & \text{otherwise.}
    \end{cases}
What is the value of the constant k?
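The support here is the triangle 0 < x < y < 1, so the inner limits depend on y. A minimal SymPy sketch of the normalization:

    import sympy as sp

    x, y, k = sp.symbols('x y k', positive=True)

    # Inner variable x runs from 0 to y on the region 0 < x < y < 1.
    total = sp.integrate(k * x * y**2, (x, 0, y), (y, 0, 1))
    print(total)                         # k/10
    print(sp.solve(sp.Eq(total, 1), k))  # [10]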
Example
Let the joint density of the continuous random variables X and Y be

f_{XY}(x, y) =
    \begin{cases}
      \frac{6}{5}(x^2 + 2xy), & \text{if } 0 \le x \le 1;\ 0 \le y \le 1 \\
      0, & \text{otherwise.}
    \end{cases}
What is the probability of the event (X ≤ Y )?
ANS: 2/5.
Continuous marginal densities
Definition
Let (X, Y) be a two-dimensional continuous random variable with joint density f_{XY}. The marginal density for X, denoted by f_X, is given by

f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dy, \quad -\infty < x < \infty.

The marginal density for Y, denoted by f_Y, is given by

f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dx, \quad -\infty < y < \infty.
Examples
Example
Suppose X and Y both take values in [0, 1] with joint density f(x, y) = 4xy. What are the marginal densities of X and Y?
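A minimal SymPy sketch of the two marginal integrals:

    import sympy as sp

    x, y = sp.symbols('x y')
    f = 4 * x * y  # joint density on the unit square

    print(sp.integrate(f, (y, 0, 1)))  # marginal of X: 2*x, for 0 < x < 1
    print(sp.integrate(f, (x, 0, 1)))  # marginal of Y: 2*y, for 0 < y < 1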
Example
If the joint density function for X and Y is given by

f_{XY}(x, y) =
    \begin{cases}
      \frac{3}{4}, & \text{for } 0 < y^2 < x < 1 \\
      0, & \text{otherwise,}
    \end{cases}
then what is the marginal density function of X, for 0 < x < 1?
ANS: \frac{3}{2}\sqrt{x}.
Example
Let X and Y have joint density function

f_{XY}(x, y) =
    \begin{cases}
      2e^{-x-y}, & \text{for } 0 < x \le y < \infty \\
      0, & \text{otherwise.}
    \end{cases}
What is the marginal density of X?
ANS: 2e^{-2x}, for 0 < x < \infty.
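Here the upper limit is infinite and the lower limit depends on x, since y ranges over (x, \infty) for a fixed x. A minimal SymPy sketch:

    import sympy as sp

    x, y = sp.symbols('x y', positive=True)
    f = 2 * sp.exp(-x - y)  # joint density on 0 < x <= y < oo

    # For fixed x, integrate out y over (x, oo):
    f_X = sp.integrate(f, (y, x, sp.oo))
    print(sp.simplify(f_X))  # 2*exp(-2*x)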
Independence
Definition
Let X and Y be continuous (discrete) random variables with joint
pdf (pmf) fXY and marginal densities (marginal pmfs) fX and fY ,
respectively. X and Y are independent if and only if
f_{XY}(x, y) = f_X(x)\, f_Y(y) \quad \text{for all } x, y.
Example
Let X and Y have the joint density

f_{XY}(x, y) =
    \begin{cases}
      4xye^{-(x^2 + y^2)}, & x > 0,\ y > 0 \\
      0, & \text{otherwise.}
    \end{cases}
Are X and Y independent?
Example
Let X and Y have the joint density

f_{XY}(x, y) =
    \begin{cases}
      \frac{1}{4}(1 + xy), & |x| < 1,\ |y| < 1 \\
      0, & \text{otherwise.}
    \end{cases}
Are X and Y independent?
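For both examples one can compute the marginals and compare their product with the joint density. A minimal SymPy sketch:

    import sympy as sp

    x, y = sp.symbols('x y', real=True)

    # First example: f(x, y) = 4*x*y*exp(-(x^2 + y^2)) on x > 0, y > 0.
    f = 4 * x * y * sp.exp(-(x**2 + y**2))
    f_X = sp.integrate(f, (y, 0, sp.oo))  # 2*x*exp(-x**2)
    f_Y = sp.integrate(f, (x, 0, sp.oo))  # 2*y*exp(-y**2)
    print(sp.simplify(f_X * f_Y - f))     # 0 -> the density factors: independent

    # Second example: f(x, y) = (1 + x*y)/4 on |x| < 1, |y| < 1.
    g = (1 + x * y) / 4
    g_X = sp.integrate(g, (y, -1, 1))  # 1/2
    g_Y = sp.integrate(g, (x, -1, 1))  # 1/2
    print(sp.simplify(g_X * g_Y - g))  # -x*y/4, not 0 -> not independent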
Expectation
Definition
Let (X , Y ) be a two-dimensional random variable (continuous or
discrete) with joint density (pdf or pmf) fXY . Let H(X , Y ) be a
real valued function of X and Y . Then the expected value of
H(X , Y ), denoted by E [H(X , Y )] is given by
1. E[H(X, Y)] = \sum_x \sum_y H(x, y) f_{XY}(x, y), if it exists, for (X, Y) discrete;

2. E[H(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} H(x, y) f_{XY}(x, y)\, dy\, dx, if it exists, for (X, Y) continuous.
Expectation continued
Example
Let X and Y have the joint density

f_{XY}(x, y) =
    \begin{cases}
      x + y, & 0 < x, y < 1 \\
      0, & \text{otherwise.}
    \end{cases}
Find E (X ), E (Y ), and E (XY ).
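A minimal SymPy sketch of the three expectations, taking H(X, Y) = X, Y, and XY in turn:

    import sympy as sp

    x, y = sp.symbols('x y')
    f = x + y  # joint density on 0 < x, y < 1

    EX  = sp.integrate(x * f,     (y, 0, 1), (x, 0, 1))
    EY  = sp.integrate(y * f,     (y, 0, 1), (x, 0, 1))
    EXY = sp.integrate(x * y * f, (y, 0, 1), (x, 0, 1))
    print(EX, EY, EXY)  # 7/12 7/12 1/3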
Covariance
When two random variables are considered simultaneously, it is useful to describe how they relate to each other, or how they vary together. A common measure of the relationship between two random variables is the covariance. For example, the height and weight of giraffes have positive covariance, because when one is large the other tends to be large as well.
Definition
Let X and Y be random variables with means \mu_X and \mu_Y, respectively. The covariance between X and Y, denoted by Cov(X, Y) or \sigma_{XY}, is given by

Cov(X, Y) = E[(X - \mu_X)(Y - \mu_Y)].

Equivalently,

Cov(X, Y) = E[XY] - E[X] E[Y].
Covariance continued
Remarks:
- If small values of X are associated with small values of Y and large values of X with large values of Y, then X - \mu_X and Y - \mu_Y will usually have the same algebraic sign. This makes (X - \mu_X)(Y - \mu_Y) positive, yielding a positive covariance.
- If the reverse is true, so that small values of X tend to be associated with large values of Y and vice versa, then X - \mu_X and Y - \mu_Y will usually have opposite algebraic signs. This makes (X - \mu_X)(Y - \mu_Y) negative, yielding a negative covariance.
Examples
Example
Let X and Y have the joint density

f_{XY}(x, y) =
    \begin{cases}
      x + y, & 0 < x, y < 1 \\
      0, & \text{otherwise.}
    \end{cases}
Find Cov (X , Y ).
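Reusing the expectations computed in the previous example, Cov(X, Y) = E[XY] - E[X]E[Y]. A minimal SymPy sketch:

    import sympy as sp

    x, y = sp.symbols('x y')
    f = x + y  # same joint density as in the expectation example

    EX  = sp.integrate(x * f,     (y, 0, 1), (x, 0, 1))  # 7/12
    EY  = sp.integrate(y * f,     (y, 0, 1), (x, 0, 1))  # 7/12
    EXY = sp.integrate(x * y * f, (y, 0, 1), (x, 0, 1))  # 1/3
    print(EXY - EX * EY)  # -1/144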
Covariance continued
Theorem
Let (X, Y) be a two-dimensional random variable with joint density f_{XY}. If X and Y are independent, then

E[XY] = E[X] E[Y].
Corollary:
- If X and Y are independent, then Cov(X, Y) = 0.
- The converse may not be true: we cannot conclude from a zero covariance that X and Y are independent (see the sketch below for a standard counterexample).
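A standard counterexample: take X uniform on \{-1, 0, 1\} and Y = X^2. A minimal Python sketch of the computation:

    from fractions import Fraction

    # X uniform on {-1, 0, 1}, Y = X**2.  Y is a function of X, so the
    # two are clearly dependent, yet the covariance is zero.
    p = Fraction(1, 3)
    support = [-1, 0, 1]

    EX  = sum(p * x        for x in support)  # 0
    EY  = sum(p * x**2     for x in support)  # 2/3
    EXY = sum(p * x * x**2 for x in support)  # E[X*Y] = E[X**3] = 0

    print(EXY - EX * EY)  # 0
    # Yet P(X=1, Y=0) = 0 while P(X=1)*P(Y=0) = 1/9: not independent.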
Properties of Covariance
Theorem
Let X and Y be random variables with correlation coefficient \rho_{XY} = Cov(X, Y)/(\sigma_X \sigma_Y). Then |\rho_{XY}| = 1 if and only if Y = \beta_0 + \beta_1 X for some real numbers \beta_0 and \beta_1 with \beta_1 \ne 0.
Remarks
- If \rho_{XY} = 1, then we say that X and Y have perfect positive correlation.
- Perfect positive correlation means that Y = \beta_0 + \beta_1 X, where \beta_1 > 0.
- This means small values of X are associated with small values of Y, and large values of X with large values of Y.
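A quick numerical illustration of the theorem, as a minimal sketch assuming NumPy; the data and the coefficients \beta_0 = 2, \beta_1 = 3 are arbitrary choices for illustration:

    import numpy as np

    # Any Y = b0 + b1*X with b1 > 0 has correlation exactly 1.
    x = np.array([0.5, 1.2, 2.7, 3.1, 4.8])
    y = 2.0 + 3.0 * x

    print(np.corrcoef(x, y)[0, 1])  # 1.0 (up to floating-point rounding)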
Correlation continued