PSLecture11 2022

The document discusses joint distributions of random variables. It defines the joint probability mass function (pmf) for discrete random variables and the joint probability density function (pdf) for continuous random variables. The joint pmf/pdf describes the probability of two random variables taking on particular values simultaneously. It also discusses concepts such as marginal pmf/pdf, normalization properties, and the joint normal/Gaussian distribution.


Probability and Statistics

MA20205

Bibhas Adhikari

Autumn 2022-23, IIT Kharagpur

Lecture 11
October 10, 2022

Bibhas Adhikari (Autumn 2022-23, IIT Kharagpur), Probability and Statistics, Lecture 11, October 10, 2022
Joint distributions

□ A single variable X is described by a one-variable pdf f(x)

□ A pair of random variables (X, Y) is described by a two-variable pdf
f(x, y)
Joint distributions

Let X and Y be two random variables with sample spaces ΩX and ΩY
respectively. Then the joint random variable is given by
(X, Y) : ΩX × ΩY → R × R.

Joint pmf for a pair of discrete rvs
Let X and Y be two discrete random variables. The joint pmf of (X, Y) is
defined as

    f(x, y) = P(X = x and Y = y) = P({(ω, η) : X(ω) = x and Y(η) = y})
Joint distributions

Let X be a random variable for a coin toss and Y for the draw of a die. The
sample space is ΩX × ΩY = {(ω, η) : ω ∈ {0, 1}, η ∈ {1, 2, 3, 4, 5, 6}}.
Then

    f(x, y) = 1/12,    (x, y) ∈ ΩX × ΩY

is a pmf corresponding to (X, Y).

Questions
1 Let A = {(x, y) : x + y = 3}. Then P(A) = ?
2 Let B = {(x, y) : min{x, y} = 1}. Then P(B) = ?
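Both questions can be answered by enumerating the 12 equally likely pairs. A minimal Python sketch (the helper names are ours, not from the lecture):

```python
from fractions import Fraction

# All 12 equally likely outcomes: coin value x in {0, 1}, die face y in {1, ..., 6}
omega = [(x, y) for x in (0, 1) for y in range(1, 7)]
f = {pt: Fraction(1, 12) for pt in omega}  # the uniform joint pmf from the slide

def P(event):
    # probability of an event = sum of the pmf over the outcomes in it
    return sum(f[pt] for pt in omega if event(pt))

p_A = P(lambda pt: pt[0] + pt[1] == 3)   # A: x + y = 3, i.e. (0, 3) and (1, 2)
p_B = P(lambda pt: min(pt) == 1)         # B: min{x, y} = 1
print(p_A, p_B)  # 1/6 1/2
```

Only (0, 3) and (1, 2) land in A, giving P(A) = 2/12 = 1/6; B collects the six pairs with x = 1 (since min(0, y) = 0 for every die face), giving P(B) = 1/2.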
Joint distributions

Joint pdf for continuous rvs
Let X, Y be continuous rvs with sample spaces ΩX, ΩY respectively. Then the
joint pdf of (X, Y) is a function f(x, y) such that

    P(A) = ∫∫_A f(x, y) dx dy

for any event A ⊂ ΩX × ΩY.

For example, if A = [a, b] × [c, d] then

    P(A) = P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_c^d ∫_a^b f(x, y) dx dy
Joint distributions

Let (X, Y) be a joint random variable with uniform distribution on
[0, 2] × [0, 2]. Then find P(A) if A = {(x, y) : x + y ≤ 2}.

    P(A) = ∫∫_A f(x, y) dx dy
         = ∫_0^2 ∫_0^{2−y} (1/4) dx dy
         = 1/2.
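As a sanity check, the probability can also be approximated numerically with a midpoint-rule double integral (the grid size n is an arbitrary choice):

```python
# P(X + Y <= 2) for (X, Y) uniform on [0, 2] x [0, 2]: the density is 1/4 there
n = 1000
h = 2.0 / n
total = 0.0
for i in range(n):
    for j in range(n):
        x, y = (i + 0.5) * h, (j + 0.5) * h   # midpoint of grid cell (i, j)
        if x + y <= 2:
            total += 0.25 * h * h             # density times cell area
print(total)  # close to 1/2
```

Geometrically, A is the triangle below the line x + y = 2, with area 2, so P(A) = 2 · (1/4) = 1/2.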
Joint distributions

Normalization property Let Ω = ΩX × ΩY. All joint pmfs and pdfs satisfy

    Σ_{(x,y)∈Ω} f(x, y) = 1    or    ∫∫_Ω f(x, y) dx dy = 1.

Problem Find the value of k for which

    f(x, y) = { k e^{−x} e^{−y},  0 ≤ y ≤ x < ∞
              { 0,                otherwise

is a joint pdf.
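Analytically, ∫_0^∞ ∫_0^x e^{−x} e^{−y} dy dx = ∫_0^∞ e^{−x}(1 − e^{−x}) dx = 1 − 1/2 = 1/2, so k = 2. A numerical sanity check in Python (the truncation point 40 stands in for ∞ and is an assumption):

```python
import math

def mass(k, upper=40.0, n=4000):
    # Integrate k * e^(-x) * (1 - e^(-x)) over x in [0, upper] by the midpoint
    # rule; the inner integral over 0 <= y <= x is already done in closed form.
    h = upper / n
    return k * sum(math.exp(-x) * (1.0 - math.exp(-x)) * h
                   for x in ((i + 0.5) * h for i in range(n)))

print(mass(1.0))  # about 0.5, so k = 2 makes the total mass 1
print(mass(2.0))  # about 1.0
```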
Joint distributions

Marginal pmf and pdf
The marginal pmfs are defined by

    fX(x) = Σ_{y∈ΩY} f(x, y),    fY(y) = Σ_{x∈ΩX} f(x, y)

and the marginal pdfs are defined by

    fX(x) = ∫_{ΩY} f(x, y) dy,    fY(y) = ∫_{ΩX} f(x, y) dx

Joint Gaussian/normal random variable
A joint Gaussian random variable (X, Y) has a joint pdf given by

    f(x, y) = (1 / (2πσ²)) exp( −[(x − µX)² + (y − µY)²] / (2σ²) )
Joint distributions

Marginal pdfs of the Gaussian:

    fX(x) = ∫_{−∞}^{∞} f(x, y) dy

          = ∫_{−∞}^{∞} (1 / (2πσ²)) exp( −[(x − µX)² + (y − µY)²] / (2σ²) ) dy

          = (1 / √(2πσ²)) exp( −(x − µX)² / (2σ²) )
            × ∫_{−∞}^{∞} (1 / √(2πσ²)) exp( −(y − µY)² / (2σ²) ) dy

The remaining integral is that of a normal pdf, hence equals 1. Thus

    fX(x) = (1 / √(2πσ²)) exp( −(x − µX)² / (2σ²) )
    fY(y) = (1 / √(2πσ²)) exp( −(y − µY)² / (2σ²) )
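The marginalization step can be verified numerically: integrating the joint pdf over y should reproduce the one-dimensional normal pdf in x. A sketch with illustrative parameter values (MU_X, MU_Y, SIGMA are assumptions, not from the slides):

```python
import math

MU_X, MU_Y, SIGMA = 1.0, -0.5, 1.5   # illustrative parameters (assumed)

def joint(x, y):
    # the joint Gaussian pdf from the slide (equal variances, no correlation)
    z = ((x - MU_X) ** 2 + (y - MU_Y) ** 2) / (2 * SIGMA ** 2)
    return math.exp(-z) / (2 * math.pi * SIGMA ** 2)

def marginal_x(x, lo=-20.0, hi=20.0, n=4000):
    # fX(x) = integral of f(x, y) over y, approximated by the midpoint rule
    h = (hi - lo) / n
    return sum(joint(x, lo + (i + 0.5) * h) for i in range(n)) * h

def normal_pdf(t, mu, sigma):
    # one-dimensional N(mu, sigma^2) pdf
    return math.exp(-((t - mu) ** 2) / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

x0 = 0.7
print(marginal_x(x0), normal_pdf(x0, MU_X, SIGMA))  # the two values agree
```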
Joint distributions

From the above derivation, note that

    f(x, y) = fX(x) fY(y).

Independent random variables
Random variables X and Y are independent if and only if

    f(x, y) = fX(x) fY(y)

Question How can this definition be justified?

Sequence of independent random variables
A sequence of random variables X1, X2, . . . , Xn is independent if and only if

    f(x1, x2, . . . , xn) = fX1(x1) fX2(x2) · · · fXn(xn)
Joint distributions

Terminology If n random variables are independent and have the same
distribution, then the random variables are called independent and
identically distributed (iid) random variables.

Joint cdf
Let X and Y be random variables. The joint cdf of X and Y is defined as

    F(x, y) = P(X ≤ x ∩ Y ≤ y).

Consequently,

    F(x, y) = Σ_{y′≤y} Σ_{x′≤x} f(x′, y′)

if X and Y are discrete, and

    F(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f(x′, y′) dx′ dy′

if X and Y are continuous.
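In the discrete case the double sum can be computed directly; this sketch reuses the coin-and-die pmf from the earlier example (the function F is ours):

```python
from fractions import Fraction

# Joint cdf by double summation, for the coin-and-die pmf f(x, y) = 1/12
omega = [(x, y) for x in (0, 1) for y in range(1, 7)]
f = {pt: Fraction(1, 12) for pt in omega}

def F(x, y):
    # F(x, y) = sum of f(x', y') over all x' <= x and y' <= y
    return sum(p for (xp, yp), p in f.items() if xp <= x and yp <= y)

print(F(1, 3))  # 6 of the 12 pairs satisfy x' <= 1, y' <= 3, so 1/2
print(F(1, 6))  # the whole sample space: 1
```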
Joint distributions

Observations
1 If X and Y are independent then

    F(x, y) = FX(x) FY(y)

2 If X and Y are iid with the Unif(0, 1) distribution, then for 0 ≤ x, y ≤ 1,

    F(x, y) = xy

3 If X and Y are iid with pdf N(µ, σ²), then

    F(x, y) = Φ((x − µ)/σ) Φ((y − µ)/σ)
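Observation 3 can be spot-checked by simulation: for iid N(µ, σ²) draws, the empirical frequency of {X ≤ x, Y ≤ y} should be close to Φ((x − µ)/σ)Φ((y − µ)/σ). A sketch (sample size, seed, and evaluation point are arbitrary choices):

```python
import math
import random

def Phi(t):
    # standard normal cdf via the error function
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2)))

random.seed(1)
mu, sigma = 0.0, 1.0
N = 200_000
x0, y0 = 0.5, -0.2

# empirical frequency of {X <= x0 and Y <= y0} for iid normal draws
hits = sum(1 for _ in range(N)
           if random.gauss(mu, sigma) <= x0 and random.gauss(mu, sigma) <= y0)
est = hits / N
exact = Phi((x0 - mu) / sigma) * Phi((y0 - mu) / sigma)
print(est, exact)  # the two are close
```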
Joint distributions

Observations
1 F(x, −∞) = 0
2 F(−∞, y) = 0
3 F(−∞, −∞) = 0
4 F(∞, ∞) = 1

Marginal cdf
The marginal cdfs are

    FX(x) = F(x, ∞),    FY(y) = F(∞, y)
Joint distributions

Question How to obtain the joint pdf from the joint cdf?

    f(x, y) = ∂²F(x, y) / ∂y ∂x = ∂²F(x, y) / ∂x ∂y

Example Let X and Y be random variables with joint cdf

    F(x, y) = (1 − e^{−λx})(1 − e^{−λy}),    x ≥ 0, y ≥ 0.

Then
    f(x, y) = λ² e^{−λx} e^{−λy}
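The differentiation step can be verified with a central finite-difference approximation of ∂²F/∂x∂y (the rate λ = 1.3 and the evaluation point are illustrative assumptions):

```python
import math

LAM = 1.3  # illustrative rate parameter (assumed)

def F(x, y):
    # the joint cdf from the example
    return (1 - math.exp(-LAM * x)) * (1 - math.exp(-LAM * y))

def f(x, y):
    # the claimed joint pdf
    return LAM ** 2 * math.exp(-LAM * x) * math.exp(-LAM * y)

def mixed_partial(x, y, h=1e-4):
    # central finite-difference approximation of d^2 F / dx dy
    return (F(x + h, y + h) - F(x + h, y - h)
            - F(x - h, y + h) + F(x - h, y - h)) / (4 * h * h)

x0, y0 = 0.8, 1.5
print(mixed_partial(x0, y0), f(x0, y0))  # the two values agree
```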
Joint distributions

Joint expectation
Let X, Y be random variables. Then the joint expectation of the pair is
defined as

    E(XY) = { Σ_{y∈ΩY} Σ_{x∈ΩX} x y f(x, y)       if X, Y are discrete
            { ∫_{ΩY} ∫_{ΩX} x y f(x, y) dx dy     if X, Y are continuous

Question Why is the joint expectation defined through the product random
variable rather than the sum (E(X + Y)), difference (E(X − Y)), or
quotient (E(X/Y))?
Joint distributions

Suppose X and Y are discrete random variables with range spaces
ΩX = {x1, . . . , xn} and ΩY = {y1, . . . , yn}. Now define the vectors

    x = [x1, . . . , xn]^T,    y = [y1, . . . , yn]^T.

Then define the pmf matrix as

        [ f(x1, y1)  f(x1, y2)  . . .  f(x1, yn) ]
    P = [ f(x2, y1)  f(x2, y2)  . . .  f(x2, yn) ]
        [     ⋮           ⋮       ⋱        ⋮     ]
        [ f(xn, y1)  f(xn, y2)  . . .  f(xn, yn) ]

Then
    E(XY) = x^T P y,

the weighted inner (scalar) product of x and y.
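A tiny numerical example (the pmf entries are made up for illustration) confirming that the matrix form x^T P y agrees with the double sum defining E(XY):

```python
# X takes values 1, 2 and Y takes values 1, 2; pmf entries are illustrative
x = [1.0, 2.0]
y = [1.0, 2.0]
P = [[0.10, 0.30],   # f(x1, y1), f(x1, y2)
     [0.20, 0.40]]   # f(x2, y1), f(x2, y2)

# E(XY) computed as the weighted inner product x^T P y
e_xy = sum(x[i] * P[i][j] * y[j] for i in range(2) for j in range(2))

# E(XY) computed from the defining double sum, for comparison
direct = sum(xi * yj * P[i][j]
             for i, xi in enumerate(x)
             for j, yj in enumerate(y))

print(e_xy, direct)  # both equal E(XY) = 2.7
```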
Joint distributions

For example, if ΩX = {1, . . . , n} = ΩY with

    f(x, y) = { 1/n,  x = y
              { 0,    x ≠ y

then
    P = (1/n) I,    E(XY) = (1/n) x^T y
Joint distributions

Recall that the cosine of the angle between x and y is defined as

    cos θ = x^T y / (∥x∥ ∥y∥)

where ∥x∥ = √(Σ_{i=1}^{n} xi²) and ∥y∥ = √(Σ_{i=1}^{n} yi²).

Geometry of expectation: the geometry defined by the weighted inner product
and the weighted norms, where

    cos θ = x^T P y / (∥x∥_{PX} ∥y∥_{PY}) = E(XY) / (√(E(X²)) √(E(Y²)))
Joint distribution

In the above,

    E(X²) = x^T PX x = ∥x∥²_{PX}
    E(Y²) = y^T PY y = ∥y∥²_{PY}

where

    PX = diag(p(x1), . . . , p(xn)),    PY = diag(p(y1), . . . , p(yn)).

Obviously,

    −1 ≤ E(XY) / (√(E(X²)) √(E(Y²))) ≤ 1

due to the Cauchy–Schwarz inequality:

    (E(XY))² ≤ E(X²) E(Y²)
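The bound can be exercised on a randomly generated joint pmf (all values here are illustrative; the seed fixes reproducibility):

```python
import math
import random

random.seed(2)
n = 5
# illustrative value vectors and a random joint pmf matrix
x = [random.uniform(-3, 3) for _ in range(n)]
y = [random.uniform(-3, 3) for _ in range(n)]
w = [[random.random() for _ in range(n)] for _ in range(n)]
s = sum(map(sum, w))
P = [[w[i][j] / s for j in range(n)] for i in range(n)]  # normalize to a pmf

e_xy = sum(x[i] * y[j] * P[i][j] for i in range(n) for j in range(n))
px = [sum(P[i][j] for j in range(n)) for i in range(n)]  # marginal pmf of X
py = [sum(P[i][j] for i in range(n)) for j in range(n)]  # marginal pmf of Y
e_x2 = sum(x[i] ** 2 * px[i] for i in range(n))
e_y2 = sum(y[j] ** 2 * py[j] for j in range(n))

cos_theta = e_xy / math.sqrt(e_x2 * e_y2)
print(-1.0 <= cos_theta <= 1.0)  # True, by Cauchy-Schwarz
```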
