
Joint Distribution, Chapter-5

Dhrubajyoti Mandal
Dept. of Mathematics
BITS Pilani K K Birla Goa Campus, Goa
Functions of More Than One Random Variable

In many statistical investigations one is interested in studying the
relationship between two or more random variables, such as the
relationship between annual income and yearly savings per family,
height and weight of persons, or the level of pollution and the rate
of respiratory illness in cities.

In such situations the random variables have a joint distribution
that allows us to understand the relationship between the variables.
Joint Probability Mass Function

Let X and Y be two discrete random variables. Suppose X takes
values in A = {x1 , x2 , · · · } and Y takes values in B = {y1 , y2 , · · · }.
Then the ordered pair (X , Y ) takes values in A × B.
Definition
The joint probability mass function of X and Y is the function
fXY : A × B → R defined by

fXY (xi , yj ) = P(X = xi and Y = yj )

satisfying the two properties

- 0 ≤ fXY(xi, yj) ≤ 1, for all (xi, yj)
- ∑_{xi} ∑_{yj} fXY(xi, yj) = 1
Joint PMF continued

Example
Toss two coins. Let X be the outcome of the first coin and Y be
the outcome of the second coin. Find the joint probability mass
function in each of the following cases.

1. Both coins are fair.
2. The first coin is fair and the second coin is unfair with
P(Head) = 0.6 (this case is sketched in code below).
3. Both coins are unfair: for the first coin P(Head) = 0.6
and for the second coin P(Tail) = 0.3.
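A minimal Python sketch of case 2, not part of the slides; it assumes the two tosses are independent, which is implicit in the example, so each joint probability is the product of the two marginal probabilities.

```python
from itertools import product

# Case 2: first coin fair, second coin has P(Head) = 0.6.
pmf_1 = {"H": 0.5, "T": 0.5}
pmf_2 = {"H": 0.6, "T": 0.4}

# Independent tosses, so fXY(x, y) = P(X = x) * P(Y = y).
joint = {(x, y): pmf_1[x] * pmf_2[y] for x, y in product("HT", repeat=2)}

print(joint)                # {('H', 'H'): 0.3, ('H', 'T'): 0.2, ...}
print(sum(joint.values()))  # ~1.0: the joint PMF sums to one
```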
Joint PMF continued
Example
In an automobile plant two tasks are performed by robots. The
first entails welding one joint; the second, tightening two bolts. Let
X denote the number of defective welds and Y the number of
improperly tightened bolts produced per car. Past data indicate
that the joint density for (X, Y) is as shown in the table below.

                Y
            0     1     2
  X   0   0.60  0.10  0.04
      1   0.15  0.08  0.03

- Find the probability that the robots make no errors, i.e. P(X = 0 and Y = 0).
- Find the probability that there are no improperly tightened bolts,
i.e. P(Y = 0). (Both are computed in the sketch below.)
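A short sketch of mine (variable names are not from the slides) that stores the table as a Python dict and reads off both answers.

```python
# Joint PMF from the table, keyed by (x, y).
joint = {(0, 0): 0.60, (0, 1): 0.10, (0, 2): 0.04,
         (1, 0): 0.15, (1, 1): 0.08, (1, 2): 0.03}

p_no_errors = joint[(0, 0)]                                       # P(X=0, Y=0) = 0.60
p_no_bad_bolts = sum(p for (x, y), p in joint.items() if y == 0)  # P(Y=0) = 0.75
print(p_no_errors, p_no_bad_bolts)
```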
Joint PMF continued

In the above example,

1. describe the event E = 'Y − X ≥ 1' and find the corresponding probability;
2. describe the event E = 'X − Y ≥ 1' and find the corresponding probability.

What if you take the row totals and the column totals in the table?
Marginal Probability Mass Function
Definition
Let (X, Y) be a two-dimensional discrete r. v. with joint density
fXY(x, y). The marginal probability mass function for X, denoted
by fX, is given by

fX(xi) = ∑_{all y} fXY(xi, y)

The marginal probability mass function for Y, denoted by fY, is
given by

fY(yj) = ∑_{all x} fXY(x, yj)

Remark: The row totals of the rectangular table of the PMF give the
marginal PMF of X and the column totals give the marginal PMF
of Y . From these marginal PMFs one can find the probability of
any event involving only X or only Y .
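A quick illustration of the remark on the robot table (my own sketch): the marginals fall out as row and column totals.

```python
from collections import defaultdict

joint = {(0, 0): 0.60, (0, 1): 0.10, (0, 2): 0.04,
         (1, 0): 0.15, (1, 1): 0.08, (1, 2): 0.03}

f_X, f_Y = defaultdict(float), defaultdict(float)
for (x, y), p in joint.items():
    f_X[x] += p  # row totals give the marginal PMF of X
    f_Y[y] += p  # column totals give the marginal PMF of Y

print(dict(f_X))  # {0: 0.74, 1: 0.26}  (approximately, in floating point)
print(dict(f_Y))  # {0: 0.75, 1: 0.18, 2: 0.07}
```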
Examples

Example
Let X and Y be discrete random variables with joint probability
density function
fXY(x, y) = (1/21)(x + y),  if x = 1, 2; y = 1, 2, 3,
            0,              otherwise.
What are the marginal probability density functions of X and Y?

Example
For what value of the constant k is the function given by

fXY(x, y) = kxy,  if x = 1, 2, 3; y = 1, 2, 3,
            0,    otherwise,

a joint probability density function of some random variables X
and Y?

ANS: 1/36.
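Both examples can be checked exactly with Python's fractions module; this is a sketch of mine, not from the slides.

```python
from fractions import Fraction

# First example: fXY(x, y) = (x + y)/21 on x = 1, 2; y = 1, 2, 3.
f = lambda x, y: Fraction(x + y, 21)
f_X = {x: sum(f(x, y) for y in (1, 2, 3)) for x in (1, 2)}
f_Y = {y: sum(f(x, y) for x in (1, 2)) for y in (1, 2, 3)}
print(f_X)  # {1: Fraction(3, 7), 2: Fraction(4, 7)}
print(f_Y)  # {1: Fraction(5, 21), 2: Fraction(1, 3), 3: Fraction(3, 7)}

# Second example: k*x*y must sum to 1, and the xy-values sum to
# (1 + 2 + 3)^2 = 36, so k = 1/36.
total = sum(x * y for x in (1, 2, 3) for y in (1, 2, 3))
print(Fraction(1, total))  # 1/36
```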
Joint Cumulative Distribution Function

Suppose X and Y are jointly distributed random variables. We will
use the notation 'X ≤ x, Y ≤ y' to mean the event 'X ≤ x and Y ≤ y'.
The joint cumulative distribution function (joint cdf) is defined as

F(x, y) = P(X ≤ x and Y ≤ y)

In the above example, find F(1, 2) and F(2, 2).
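Assuming "the above example" refers to the (x + y)/21 PMF, the joint cdf just sums the PMF over the lower-left rectangle; a small sketch:

```python
from fractions import Fraction

f = lambda x, y: Fraction(x + y, 21)  # the (x + y)/21 PMF from the earlier example

def F(a, b):
    """Joint CDF: sum fXY(x, y) over all x <= a and y <= b."""
    return sum(f(x, y) for x in (1, 2) if x <= a
               for y in (1, 2, 3) if y <= b)

print(F(1, 2))  # 5/21
print(F(2, 2))  # 4/7
```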


Continuous joint density
Definition
Let X and Y be continuous random variables. Then the ordered
pair (X, Y) is called a two-dimensional continuous random variable.
A function fXY satisfying
- fXY(x, y) ≥ 0, −∞ < x < ∞, −∞ < y < ∞
- ∫_{−∞}^{∞} ∫_{−∞}^{∞} fXY(x, y) dy dx = 1

is called a joint probability density function for (X, Y).


Then for any A ⊂ R² we have

P[(X, Y) ∈ A] = ∬_A fXY(x, y) dy dx

In particular, if A = [a, b] × [c, d], then

P[a ≤ X ≤ b and c ≤ Y ≤ d] = ∫_a^b ∫_c^d fXY(x, y) dy dx

for real a, b, c, d.
Examples

Example
A privately owned business operates both a drive-in facility and a
walk-in facility. On a randomly selected day, let X and Y
respectively, be the proportions of the time that the drive-in and
the walk-in facilities are in use, and suppose that the joint density
function of these random variables is
fXY(x, y) = (2/5)(2x + 3y),  0 ≤ x ≤ 1, 0 ≤ y ≤ 1,
            0,               elsewhere.

- Verify that ∫_{−∞}^{∞} ∫_{−∞}^{∞} fXY(x, y) dy dx = 1.
- Find P[(X, Y) ∈ A], where A = {(x, y) | 0 < x < 1/2, 1/4 < y < 1/2}.
(Both parts are verified in the sketch below.)
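Both parts can be verified symbolically; a sketch of mine, assuming SymPy is available:

```python
import sympy as sp

x, y = sp.symbols("x y")
f = sp.Rational(2, 5) * (2*x + 3*y)

# Total probability over the unit square.
print(sp.integrate(f, (y, 0, 1), (x, 0, 1)))  # 1

# P(0 < X < 1/2, 1/4 < Y < 1/2).
print(sp.integrate(f, (y, sp.Rational(1, 4), sp.Rational(1, 2)),
                      (x, 0, sp.Rational(1, 2))))  # 13/160
```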
Examples

Example
Let the joint density function of X and Y be given by

fXY(x, y) = kxy²,  if 0 < x < y < 1,
            0,     otherwise.

What is the value of the constant k?

Example
Let the joint density of the continuous random variables X and Y be

fXY(x, y) = (6/5)(x² + 2xy),  if 0 ≤ x ≤ 1; 0 ≤ y ≤ 1,
            0,                otherwise.

What is the probability of the event (X ≤ Y)?

ANS: 2/5.
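A SymPy sketch for both examples (mine, not from the slides); note how the triangular region 0 < x < y < 1 in the first one becomes nested integration limits.

```python
import sympy as sp

x, y, k = sp.symbols("x y k", positive=True)

# First example: k*x*y**2 on 0 < x < y < 1; integrate x from 0 to y.
total = sp.integrate(k * x * y**2, (x, 0, y), (y, 0, 1))
print(sp.solve(sp.Eq(total, 1), k))  # [10], so k = 10

# Second example: P(X <= Y) for f = (6/5)(x^2 + 2xy) on the unit square.
f = sp.Rational(6, 5) * (x**2 + 2*x*y)
print(sp.integrate(f, (y, x, 1), (x, 0, 1)))  # 2/5
```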
Continuous marginal densities

Definition
Let (X, Y) be a two-dimensional continuous random variable with
joint density fXY. The marginal density for X, denoted by fX, is
given by

fX(x) = ∫_{−∞}^{∞} fXY(x, y) dy,  for −∞ < x < ∞.

The marginal density for Y, denoted by fY, is given by

fY(y) = ∫_{−∞}^{∞} fXY(x, y) dx,  for −∞ < y < ∞.
Examples

Example
Suppose X and Y both take values in [0,1] with density

f (x , y ) = 4xy .

- Verify that f(x, y) is a valid joint pdf.
- Find the probability of the event A = 'X < 0.5 and Y > 0.5'.
- Find fX and fY. (All three parts are checked in the sketch below.)
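A SymPy check of all three parts, as a sketch under the stated density:

```python
import sympy as sp

x, y = sp.symbols("x y")
f = 4*x*y  # density on [0, 1] x [0, 1]

print(sp.integrate(f, (y, 0, 1), (x, 0, 1)))  # 1, so f is a valid joint pdf
print(sp.integrate(f, (y, sp.Rational(1, 2), 1),
                      (x, 0, sp.Rational(1, 2))))  # 3/16 = P(X < 0.5, Y > 0.5)
print(sp.integrate(f, (y, 0, 1)),  # fX(x) = 2x
      sp.integrate(f, (x, 0, 1)))  # fY(y) = 2y
```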
Examples

Example
If the joint density function for X and Y is given by

fXY(x, y) = 3/4,  for 0 < y² < x < 1,
            0,    otherwise,

then what is the marginal density function of X, for 0 < x < 1?

ANS: (3/2)√x.

Example
Let X and Y have joint density function

fXY(x, y) = 2e^{−x−y},  for 0 < x ≤ y < ∞,
            0,          otherwise.

What is the marginal density of X?

ANS: 2e^{−2x}, 0 < x < ∞.
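Both marginals can be reproduced with SymPy (a sketch of mine); the key step is translating each support into integration limits for y.

```python
import sympy as sp

x = sp.symbols("x", positive=True)
y = sp.symbols("y", real=True)

# First example: f = 3/4 on 0 < y^2 < x < 1, i.e. -sqrt(x) < y < sqrt(x).
print(sp.integrate(sp.Rational(3, 4), (y, -sp.sqrt(x), sp.sqrt(x))))  # 3*sqrt(x)/2

# Second example: f = 2*e^{-x-y} on 0 < x <= y < oo, so y runs from x to oo.
print(sp.simplify(sp.integrate(2 * sp.exp(-x - y), (y, x, sp.oo))))  # 2*exp(-2*x)
```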
Independence

Definition
Let X and Y be continuous (discrete) random variables with joint
pdf (pmf) fXY and marginal densities (marginal pmfs) fX and fY ,
respectively. X and Y are independent if and only if

fXY (x , y ) = fX (x )fY (y )

for all x and y .


Examples

Example
Let X and Y have the joint density

fXY(x, y) = 4xy e^{−(x²+y²)},  x > 0, y > 0,
            0,                 otherwise.

Are X and Y independent?

Example
Let X and Y have the joint density

fXY(x, y) = (1/4)(1 + xy),  |x| < 1, |y| < 1,
            0,              otherwise.

Are X and Y independent?
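One way to settle both questions with SymPy: compute the marginals and compare their product with the joint density. The helper `independent` is my own, not a standard API.

```python
import sympy as sp

x, y = sp.symbols("x y", real=True)

def independent(f, x_dom, y_dom):
    """Return True iff f equals the product of its marginals on the domain."""
    f_x = sp.integrate(f, (y, *y_dom))  # marginal of X
    f_y = sp.integrate(f, (x, *x_dom))  # marginal of Y
    return sp.simplify(f - f_x * f_y) == 0

# First example: 4*x*y*e^{-(x^2 + y^2)} on x > 0, y > 0.
print(independent(4*x*y*sp.exp(-(x**2 + y**2)), (0, sp.oo), (0, sp.oo)))  # True

# Second example: (1/4)(1 + x*y) on |x| < 1, |y| < 1.
print(independent(sp.Rational(1, 4) * (1 + x*y), (-1, 1), (-1, 1)))  # False
```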
Expectation

Definition
Let (X , Y ) be a two-dimensional random variable (continuous or
discrete) with joint density (pdf or pmf) fXY . Let H(X , Y ) be a
real valued function of X and Y . Then the expected value of
H(X , Y ), denoted by E [H(X , Y )] is given by
1. E[H(X, Y)] = ∑_x ∑_y H(x, y) fXY(x, y), if it exists, for (X, Y) discrete;
2. E[H(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} H(x, y) fXY(x, y) dy dx, if it exists, for (X, Y) continuous.
Expectation continued

Question: In the above formula, what if we take H(X, Y) = X and
H(X, Y) = Y?

Univariate Averages Found via the Joint Density

1. E[X] = ∑_x ∑_y x fXY(x, y), if it exists, for (X, Y) discrete.
2. E[Y] = ∑_x ∑_y y fXY(x, y), if it exists, for (X, Y) discrete.
3. E[X] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x fXY(x, y) dx dy, if it exists, for (X, Y) continuous.
4. E[Y] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} y fXY(x, y) dx dy, if it exists, for (X, Y) continuous.
Examples

Example
Let X and Y have the joint density

fXY(x, y) = x + y,  0 < x, y < 1,
            0,      otherwise.
Find E (X ), E (Y ), and E (XY ).
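A SymPy check (a sketch) using the continuous formula E[H(X, Y)] = ∬ H(x, y) fXY(x, y) dy dx:

```python
import sympy as sp

x, y = sp.symbols("x y")
f = x + y  # joint density on the unit square

E = lambda h: sp.integrate(h * f, (y, 0, 1), (x, 0, 1))
print(E(x), E(y), E(x*y))  # 7/12 7/12 1/3
```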
Covariance
When two random variables are being considered simultaneously, it
is useful to describe how they relate to each other, or how they
vary together. A common measure of the relationship between two
random variables is the covariance. For example, height and
weight of giraffes have positive covariance because when one is big
the other tends also to be big.
Definition
Let X and Y be r. v. with means µX and µY , respectively. The
covariance between X and Y , denoted by Cov(X , Y ) or σXY is
given by
Cov(X , Y ) = E [(X − µX )(Y − µY )]

Computational formula for covariance

Cov(X , Y ) = E [XY ] − E [X ]E [Y ]
Covariance continued

Remarks:
- If small values of X are associated with small values of Y and
large values of X with large values of Y, then X − µX and
Y − µY will usually have the same algebraic sign. This
implies (X − µX)(Y − µY) will be positive, yielding a positive
covariance.
- If the reverse is true and small values of X tend to be associated
with large values of Y and vice versa, then X − µX and Y − µY
will usually have opposite algebraic signs. This makes
(X − µX)(Y − µY) negative, yielding a negative covariance.
Examples

Example
Let X and Y have the joint density

fXY(x, y) = x + y,  0 < x, y < 1,
            0,      otherwise.
Find Cov (X , Y ).
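Using the computational formula Cov(X, Y) = E[XY] − E[X]E[Y] and the expectations from the previous example, a short SymPy sketch gives the answer:

```python
import sympy as sp

x, y = sp.symbols("x y")
f = x + y  # joint density on the unit square

E = lambda h: sp.integrate(h * f, (y, 0, 1), (x, 0, 1))
print(E(x*y) - E(x) * E(y))  # -1/144
```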
Covariance continued

Theorem
Let (X, Y) be a two-dimensional r. v. with joint density fXY . If X
and Y are independent then

E [XY ] = E [X ]E [Y ]

Corollary:
- If X and Y are independent, then Cov(X , Y ) = 0.
- The converse may not be true; that is, we cannot conclude that
zero covariance implies independence.
Properties of Covariance

1. Cov(aX + b, cY + d) = ac Cov(X, Y), where a, b, c, d are constants.
2. Cov(X1 + X2, Y) = Cov(X1, Y) + Cov(X2, Y)
3. Cov(X, X) = Var(X)
4. Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)

(Properties 1 and 3 are checked symbolically in the sketch below.)
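Properties 1 and 3 can be sanity-checked on the x + y density from the earlier examples; a sketch of mine, where the constants a, b, c, d are arbitrary picks.

```python
import sympy as sp

x, y = sp.symbols("x y")
f = x + y  # joint density on the unit square, as in the earlier example

E = lambda h: sp.integrate(h * f, (y, 0, 1), (x, 0, 1))
cov = lambda u, v: E(u * v) - E(u) * E(v)

a, b, c, d = 2, 3, -4, 5  # arbitrary constants
print(cov(a*x + b, c*y + d))            # 1/18
print(a * c * cov(x, y))                # 1/18, matching property 1
print(cov(x, x) - (E(x**2) - E(x)**2))  # 0, matching property 3
```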
Correlation
The units of covariance Cov (X , Y ) are units of X times units of
Y . This makes it hard to compare covariances: if we change scales
then the covariance changes as well.

Cov (aX + b, cY + d) = acCov (X , Y ), a, b, c, d constants

Correlation is a way to remove the scale from the covariance.


Definition
Let X and Y be r. v. with means µX and µY and variances σX² and
σY², respectively. The correlation ρXY between X and Y is given by

ρXY = Cov(X, Y)/(σX σY) = Cov(X, Y)/√(Var(X) Var(Y))

Theorem
The correlation coefficient ρXY for any two r. v. X and Y lies
between −1 and 1 inclusive, i.e. −1 ≤ ρXY ≤ 1, and it is a
dimensionless quantity.
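For the x + y density used in the covariance example, the correlation works out to −1/11; a SymPy sketch:

```python
import sympy as sp

x, y = sp.symbols("x y")
f = x + y  # joint density on the unit square

E = lambda h: sp.integrate(h * f, (y, 0, 1), (x, 0, 1))
var_x = E(x**2) - E(x)**2   # 11/144
var_y = E(y**2) - E(y)**2   # 11/144
cov = E(x*y) - E(x) * E(y)  # -1/144
print(sp.simplify(cov / sp.sqrt(var_x * var_y)))  # -1/11
```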
Correlation continued

Theorem
Let X and Y be random variables with correlation coefficient ρXY .
Then |ρXY | = 1 if and only if Y = β0 + β1X for some real numbers
β0 and β1 ≠ 0.
Remarks
- If ρ = 1, then we say that X and Y have perfect positive
correlation.
- Perfect positive correlation means that Y = β0 + β1X, where
β1 > 0.
- This means small values of X are associated with small values
of Y and large values of X are associated with large values of Y.
Correlation continued

- Perfect negative correlation implies that Y = β0 + β1X, where
β1 < 0.
- This means small values of X are associated with large values
of Y and vice versa.
- If ρ = 0, we say that X and Y are uncorrelated.
