Chapter 5

This document covers key concepts in joint distributions of random variables, including:
- Definitions of joint cumulative distribution functions (CDFs), probability mass functions (PMFs), and probability density functions (PDFs)
- Marginal distributions and how they relate to joint distributions
- Independence of random variables and how it simplifies joint distributions
- Conditional distributions and how to compute them from joint distributions
- Distributions of sums and functions of independent random variables
- Expectations and conditional expectations of random variables

The document provides numerous examples to illustrate these concepts.


Chapter 5. Joint distributions

5.1 Jointly distributed random variables


The (joint) cumulative distribution function (cdf) of
two r.v. X and Y is defined as the bivariate function

    F(x, y) = P(X ≤ x, Y ≤ y),  for all real x, y.

The marginal distributions of X and Y are given by

    F_X(x) = P(X ≤ x) = lim_{y→∞} F(x, y),    F_Y(y) = P(Y ≤ y) = lim_{x→∞} F(x, y).


Joint cdf

From the joint cdf, the probability of any rectangle can be computed:

    P(a1 < X ≤ a2, b1 < Y ≤ b2) = F(a2, b2) − F(a1, b2) − F(a2, b1) + F(a1, b1).
Joint pmf

Joint probability mass function: Let X and Y be discrete
random variables, taking on values x1, x2, ...
and y1, y2, ..., respectively. The joint probability
mass function of (X, Y) is defined as

    p(x, y) = P(X = x, Y = y).

The marginal pmf of X and of Y can be recovered as

    p_X(x) = Σ_y p(x, y),    p_Y(y) = Σ_x p(x, y).
Joint pdf

Joint probability density function: two random variables
X and Y are said to be jointly continuous if there exists
a function f(x, y), defined for all reals x and y, having
the property that for every subset C of the
two-dimensional plane:

    P((X, Y) ∈ C) = ∬_C f(x, y) dx dy.

f(x, y) is called the joint density function of X and Y.
The marginal pdf of X can be obtained from (and
similarly for Y):

    f_X(x) = ∫_{−∞}^{∞} f(x, y) dy.
Joint pmf

Example: A fair coin is tossed three times independently; let X
denote the number of Heads in the first toss, and Y denote
the total number of Heads. Find the joint pmf of X and Y,
together with the marginal pmfs of X and Y.

Solution: The joint and marginal pmfs are given in the following
table:
              Y = 0    Y = 1       Y = 2       Y = 3    p_X(x)
    X = 0     1/8      1/4         1/8         0        1/2
              (TTT)    (TTH, THT)  (THH)
    X = 1     0        1/8         1/4         1/8      1/2
                       (HTT)       (HTH, HHT)  (HHH)
    p_Y(y)    1/8      3/8         3/8         1/8
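The table can be checked by brute-force enumeration of the 8 equally likely outcomes; a minimal Python sketch (not part of the original slides):

```python
from itertools import product
from fractions import Fraction

# Enumerate the 8 equally likely outcomes of three fair coin tosses.
# X = number of Heads in the first toss, Y = total number of Heads.
joint = {}
for toss in product("HT", repeat=3):
    x = 1 if toss[0] == "H" else 0
    y = toss.count("H")
    joint[(x, y)] = joint.get((x, y), Fraction(0)) + Fraction(1, 8)

# Marginals, recovered by summing the joint pmf over the other variable.
p_x = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in range(4)}

print(joint[(0, 1)])  # 1/4  (TTH, THT)
print(p_y[1])         # 3/8
```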

Multinomial distribution

One of the most important discrete joint distributions is
the Multinomial distribution: A sequence of n
independent and identical experiments is performed,
each resulting in any one of r possible outcomes, with
respective probabilities p_1, p_2, ..., p_r (p_1 + ... + p_r = 1).
Let X_i denote the number, among the n
experiments, that result in outcome i; then

    P(X_1 = n_1, ..., X_r = n_r) = n! / (n_1! n_2! ... n_r!) · p_1^{n_1} p_2^{n_2} ... p_r^{n_r},

whenever n_1 + n_2 + ... + n_r = n.

Multinomial distribution

Example: Suppose that a fair die is rolled 9 times. What
is the probability that 1 appears three times, 2 and 3
twice each, 4 and 5 once each, and 6 not at all?

Solution:

    P = 9! / (3! 2! 2! 1! 1! 0!) · (1/6)^9 ≈ 0.0015.
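The arithmetic can be verified with a short Python computation (an illustration, not part of the original slides):

```python
from math import factorial

# Multinomial probability: 9 rolls of a fair die, face counts
# (3, 2, 2, 1, 1, 0) for faces 1..6, each face having probability 1/6.
n = 9
counts = [3, 2, 2, 1, 1, 0]
coef = factorial(n)
for k in counts:
    coef //= factorial(k)          # multinomial coefficient 9!/(3!2!2!1!1!0!)
prob = coef * (1 / 6) ** n
print(coef)                        # 15120 distinct orderings
print(round(prob, 4))              # 0.0015
```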
Joint pdf

Example: The joint density function of X and Y is equal to


Compute:
(a) P{X > 1, Y < 1}
(b) P{X < a}
(c) P{X < Y}
Independence of random variables

Definition: X and Y are said to be independent if, for any
two sets of real numbers A and B,

    P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B).   (*)

It can be shown that (*) holds iff for all a and b,

    F(a, b) = F_X(a) F_Y(b).

When X and Y are discrete, it is equivalent to:

    p(x, y) = p_X(x) p_Y(y)  for all x, y.

In the continuous case, it is equivalent to:

    f(x, y) = f_X(x) f_Y(y)  for all x, y.
Independence of random variables

Example: A fair die is rolled twice. Let X be the outcome
of the first roll, and Z be the sum of the two rolls. Are
X and Z independent?

Solution: We showed in a previous example (Chapter 2)
that the events {X=4} and {Z=6} are dependent, thus X
and Z are not independent (even though {X=4} and
{Z=7} are independent, for example).
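Both claims can be verified by enumerating the 36 equally likely pairs of rolls; a small Python sketch (not from the original slides):

```python
from fractions import Fraction
from itertools import product

# Two fair die rolls: X = first roll, Z = sum of the two rolls.
# Check the claimed dependence of {X=4} and {Z=6}, and the
# independence of {X=4} and {Z=7}.
outcomes = list(product(range(1, 7), repeat=2))
pr = Fraction(1, 36)

P = lambda ev: sum(pr for w in outcomes if ev(w))

p_x4 = P(lambda w: w[0] == 4)                       # 1/6
p_z6 = P(lambda w: sum(w) == 6)                     # 5/36
p_x4_z6 = P(lambda w: w[0] == 4 and sum(w) == 6)    # 1/36
print(p_x4_z6 == p_x4 * p_z6)  # False -> {X=4}, {Z=6} dependent

p_z7 = P(lambda w: sum(w) == 7)                     # 1/6
p_x4_z7 = P(lambda w: w[0] == 4 and sum(w) == 7)    # 1/36
print(p_x4_z7 == p_x4 * p_z7)  # True -> {X=4}, {Z=7} independent
```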

Independence of random variables

Example: ("thinning" of the Poisson distribution) Suppose
that the number of people who enter a post office on
a given day is a Poisson random variable with
parameter λ. Each person entering the post office is a
man with probability p, and a woman with probability
1 − p.
Show that the numbers of men and women who enter
the post office are independent Poisson r.v., with
parameters λp and λ(1 − p), respectively.
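The key identity behind the proof, P(M = m, W = w) = P(N = m + w) · C(m+w, m) p^m (1−p)^w = [Poisson(λp) pmf at m] · [Poisson(λ(1−p)) pmf at w], can be checked numerically; the parameters λ = 4 and p = 0.3 below are assumed for illustration only:

```python
from math import comb, exp, factorial

# Check the thinning identity on a grid: conditioning on the total
# N = m + w and splitting binomially should factor into the product of
# two independent Poisson pmfs with means lam*p and lam*(1-p).
lam, p = 4.0, 0.3   # assumed illustrative parameters

def pois(mu, k):
    """Poisson(mu) pmf at k."""
    return exp(-mu) * mu**k / factorial(k)

for m in range(6):
    for w in range(6):
        two_stage = pois(lam, m + w) * comb(m + w, m) * p**m * (1 - p)**w
        product = pois(lam * p, m) * pois(lam * (1 - p), w)
        assert abs(two_stage - product) < 1e-12
print("thinning identity verified on a 6x6 grid")
```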
Independence of random variables

Proposition: Two continuous (resp., discrete) random
variables X and Y are independent if and only if their
joint probability density (resp., mass) functions satisfy

    f(x, y) = f_X(x) f_Y(y)    (resp., p(x, y) = p_X(x) p_Y(y))

for all real numbers x, y.
Independence of random variables

Example: If the joint density function of X and Y is

(a)
(b)

and equal to 0 otherwise, are X and Y independent?

Solution: positive domain in (b):

(a) Independent
(b) Not independent
Independence of random variables

Example: Suppose that       are independent, and
distributed as       , respectively.
Find the probability that       .


Independence of random variables

General definition of independence: X_1, X_2, ..., X_n are
said to be independent if, for all sets A_1, ..., A_n of real numbers,

    P(X_1 ∈ A_1, ..., X_n ∈ A_n) = P(X_1 ∈ A_1) · · · P(X_n ∈ A_n).
Sum of independent random variables

When X and Y are independent, the distribution of X + Y
can be computed from the distributions of X and Y:

    F_{X+Y}(a) = P(X + Y ≤ a) = ∫_{−∞}^{∞} F_X(a − y) f_Y(y) dy.
Sum of independent random variables

Differentiating both sides w.r.t. a, we get:

    f_{X+Y}(a) = ∫_{−∞}^{∞} f_X(a − y) f_Y(y) dy.

The integral above is called the convolution of the
functions f_X and f_Y.
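The discrete analogue of this formula, p_{X+Y}(a) = Σ_y p_X(a − y) p_Y(y), is easy to compute directly; a short sketch (not from the original slides) convolves two fair-die pmfs:

```python
from fractions import Fraction

# Discrete convolution: pmf of X + Y for independent X and Y.
die = {k: Fraction(1, 6) for k in range(1, 7)}

def convolve(p_x, p_y):
    """Return the pmf of X + Y given the pmfs of independent X and Y."""
    out = {}
    for x, px in p_x.items():
        for y, py in p_y.items():
            out[x + y] = out.get(x + y, Fraction(0)) + px * py
    return out

total = convolve(die, die)
print(total[7])   # 1/6, the most likely sum of two fair dice
```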
Independent normal r.v.

Proposition: If X_1, ..., X_n are independent random
variables which are normally distributed, with
parameters (μ_i, σ_i²), i = 1, ..., n, then X_1 + ... + X_n is also
normally distributed, with parameters μ_1 + ... + μ_n and
σ_1² + ... + σ_n².
Independent Poisson / binomial

Other important results: If X and Y are independent Poisson
random variables with respective parameters λ_1 and λ_2,
then X + Y is also a Poisson random variable, with parameter
λ_1 + λ_2.

If X and Y are independent binomial random variables with
respective parameters (n, p) and (m, p), then X + Y is also a
binomial random variable, with parameters (n + m, p).
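The Poisson result can be checked numerically via the convolution formula from the previous slide; the parameters λ_1 = 2 and λ_2 = 3 are assumed for illustration:

```python
from math import exp, factorial

# Check: the convolution of Poisson(l1) and Poisson(l2) pmfs matches
# the Poisson(l1 + l2) pmf (illustrative parameters l1 = 2, l2 = 3).
l1, l2 = 2.0, 3.0

def pois(mu, k):
    """Poisson(mu) pmf at k."""
    return exp(-mu) * mu**k / factorial(k)

for n in range(10):
    conv = sum(pois(l1, k) * pois(l2, n - k) for k in range(n + 1))
    assert abs(conv - pois(l1 + l2, n)) < 1e-12
print("Poisson convolution check passed")
```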
5.2 Conditional Distributions

Discrete case:
Definition: If X and Y are two discrete random variables,
the conditional probability that X = x given that Y = y
(conditional pmf) is

    p_{X|Y}(x|y) = P(X = x | Y = y) = p(x, y) / p_Y(y),  for p_Y(y) > 0.

Remark: If X and Y are independent random variables,
then the conditional probability mass function is
simply equal to the unconditional one: p_{X|Y}(x|y) = p_X(x).
Conditional pmf

Example: Suppose that p(x,y), the joint probability mass
function of X and Y, is given by p(0,0) = .4, p(0,1) = .2,
p(1,0) = .1 and p(1,1) = .3. Calculate the conditional
probability mass function of X, given that Y = 1.
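The computation is a direct application of the definition p_{X|Y}(x|1) = p(x, 1) / p_Y(1); a short sketch (not from the original slides):

```python
# Conditional pmf of X given Y = 1 for the joint pmf in the example.
p = {(0, 0): 0.4, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.3}

p_y1 = p[(0, 1)] + p[(1, 1)]            # marginal P(Y = 1) = 0.5
cond = {x: p[(x, 1)] / p_y1 for x in (0, 1)}
print(cond)  # {0: 0.4, 1: 0.6}
```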
Conditional pmf

Example: If X and Y are independent Poisson random
variables, with parameters λ_1 and λ_2 respectively,
calculate the conditional distribution of X, given that
X + Y = n.
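The known answer, X | X + Y = n ~ Binomial(n, λ_1/(λ_1 + λ_2)), can be confirmed numerically; the values λ_1 = 2, λ_2 = 5, n = 8 below are assumed for illustration:

```python
from math import comb, exp, factorial

# For independent X ~ Poisson(l1) and Y ~ Poisson(l2), check that
# P(X = k | X + Y = n) equals the Binomial(n, l1/(l1+l2)) pmf.
l1, l2, n = 2.0, 5.0, 8   # assumed illustrative parameters

def pois(mu, k):
    """Poisson(mu) pmf at k."""
    return exp(-mu) * mu**k / factorial(k)

p_sum_n = pois(l1 + l2, n)        # X + Y ~ Poisson(l1 + l2)
q = l1 / (l1 + l2)
for k in range(n + 1):
    cond = pois(l1, k) * pois(l2, n - k) / p_sum_n
    binom = comb(n, k) * q**k * (1 - q) ** (n - k)
    assert abs(cond - binom) < 1e-12
print("conditional distribution matches Binomial(n, l1/(l1+l2))")
```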
Conditional pdf

Continuous case:
Definition: If X and Y have a joint probability density
function f(x, y), then the conditional pdf of X, given
that Y = y, is defined by

    f_{X|Y}(x|y) = f(x, y) / f_Y(y)

for all values of y such that f_Y(y) > 0.

If X and Y are independent, f_{X|Y}(x|y) = f_X(x).
Conditional pdf

If X and Y are jointly continuous, then for any set A,

    P(X ∈ A | Y = y) = ∫_A f_{X|Y}(x|y) dx.

In particular, for A = (−∞, a], we can define the
conditional cdf of X, given that Y = y, by

    F_{X|Y}(a|y) = P(X ≤ a | Y = y) = ∫_{−∞}^{a} f_{X|Y}(x|y) dx.
Conditional pdf

Example: The joint density function of X and Y is given by


Find f(x|y).
Conditional pdf

Example: The joint density function of X and Y is given by


Compute P(X > 1 | Y = y).
Functions of r.v.

5.3 Joint distribution of functions of random variables

Let X_1, X_2 be jointly continuous random variables
with joint pdf f_{X_1,X_2}. We want to compute the density
function of

    Y_1 = g_1(X_1, X_2),  Y_2 = g_2(X_1, X_2).
Functions of r.v.

Assume that the functions g_1, g_2 satisfy the
following conditions:
1.  The equations y_1 = g_1(x_1, x_2), y_2 = g_2(x_1, x_2) can be uniquely
solved for x_1, x_2 to get x_1 = h_1(y_1, y_2), x_2 = h_2(y_1, y_2).
2.  The functions g_1, g_2 have continuous partial
derivatives, and the Jacobian determinant

    J(x_1, x_2) = det [ ∂g_1/∂x_1  ∂g_1/∂x_2 ; ∂g_2/∂x_1  ∂g_2/∂x_2 ] ≠ 0

at all points (x_1, x_2). Then

    f_{Y_1,Y_2}(y_1, y_2) = f_{X_1,X_2}(x_1, x_2) |J(x_1, x_2)|^{−1},
    where x_1 = h_1(y_1, y_2), x_2 = h_2(y_1, y_2).
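As a sanity check of the formula on a concrete case not taken from the slides: for X_1, X_2 i.i.d. Uniform(0,1) and the map Y_1 = X_1 + X_2, Y_2 = X_1 − X_2, the Jacobian is J = det[[1, 1], [1, −1]] = −2, so the transformed density should integrate to 1:

```python
# Change-of-variables check: Y1 = X1 + X2, Y2 = X1 - X2 with
# X1, X2 iid Uniform(0,1). Here |J| = 2, so
#   f_{Y1,Y2}(y1, y2) = (1/2) * f_{X1,X2}((y1+y2)/2, (y1-y2)/2).
# A Riemann sum of this density over a box containing the support
# should give total mass close to 1.

def f_x(x1, x2):
    """Joint density of two independent Uniform(0,1) random variables."""
    return 1.0 if 0.0 < x1 < 1.0 and 0.0 < x2 < 1.0 else 0.0

def f_y(y1, y2):
    """Density of (Y1, Y2) via the transformation formula."""
    return 0.5 * f_x((y1 + y2) / 2.0, (y1 - y2) / 2.0)

n = 400
h1, h2 = 2.0 / n, 2.0 / n     # midpoint rule over [0, 2] x [-1, 1]
mass = sum(
    f_y(h1 * (i + 0.5), -1.0 + h2 * (j + 0.5)) * h1 * h2
    for i in range(n)
    for j in range(n)
)
print(abs(mass - 1.0) < 0.02)  # True: total mass is approximately 1
```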
5.4 Expectation, conditional expectation

Discrete:    E[X] = Σ_x x p(x)

Continuous:  E[X] = ∫_{−∞}^{∞} x f(x) dx

Since E[X] is a weighted average of the possible values of X,
if a ≤ X ≤ b, then a ≤ E[X] ≤ b.

Recall:

    E[g(X)] = Σ_x g(x) p(x)   (discrete),
    E[g(X)] = ∫ g(x) f(x) dx  (continuous).
Expectation, conditional expectation

A two-dimensional analog is the following:
Proposition:

    E[g(X, Y)] = Σ_x Σ_y g(x, y) p(x, y)      (discrete),
    E[g(X, Y)] = ∬ g(x, y) f(x, y) dx dy      (continuous).

In particular, E[X + Y] = E[X] + E[Y].
Expectation, conditional expectation

By induction, we have

    E[X_1 + X_2 + ... + X_n] = E[X_1] + E[X_2] + ... + E[X_n].

Example: (Sample mean). Let X_1, ..., X_n be independent
and identically distributed (i.i.d.) random variables
having cdf F and expected value μ. Such a sequence
of r.v. is said to be a random sample from F. The
sample mean is defined by

    X̄ = (1/n) Σ_{i=1}^{n} X_i.

Compute its expectation.
Expectation, conditional expectation

Covariance:
The covariance between two random variables is a
measure of how they are related.
Definition: The covariance between X and Y, denoted by
Cov(X, Y), is defined by

    Cov(X, Y) = E[(X − E[X])(Y − E[Y])].

Interpretation: When Cov(X,Y) > 0, higher than expected
values of X tend to occur together with higher than
expected values of Y. When Cov(X,Y) < 0, higher than
expected values of X tend to occur together with
lower than expected values of Y.
Expectation, conditional expectation

By expanding the right-hand side of the definition of the
covariance, we see that

    Cov(X, Y) = E[XY] − E[X] E[Y].

If X and Y are independent, then E[XY] = E[X] E[Y],
so Cov(X, Y) = 0.
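Both forms of the covariance can be compared on a concrete joint pmf; the sketch below (not from the original slides) reuses the table from the earlier conditional-pmf example:

```python
# Covariance from a joint pmf, computed two ways: the definition
# E[(X - E[X])(Y - E[Y])] and the expansion E[XY] - E[X]E[Y].
p = {(0, 0): 0.4, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.3}

ex = sum(x * pr for (x, y), pr in p.items())        # E[X]  = 0.4
ey = sum(y * pr for (x, y), pr in p.items())        # E[Y]  = 0.5
exy = sum(x * y * pr for (x, y), pr in p.items())   # E[XY] = 0.3
cov_def = sum((x - ex) * (y - ey) * pr for (x, y), pr in p.items())

print(round(exy - ex * ey, 10))  # 0.1, and the two forms agree
```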
Expectation, conditional expectation

Definition: If Cov(X,Y) = 0, we say that X and Y are
uncorrelated. If Cov(X,Y) > 0, we say that X and Y are
positively correlated. If Cov(X,Y) < 0, we say that X and
Y are negatively correlated.

So the previous calculation tells us that independence
implies uncorrelatedness.

Proposition: If X and Y are independent, then for any
two functions g and h, g(X) and h(Y) are independent.
Expectation, conditional expectation

Proposition:
(i)   Cov(X, Y) = Cov(Y, X);
(ii)  Cov(X, X) = Var(X);
(iii) Cov(aX, Y) = a Cov(X, Y);
(iv)  Cov(Σ_i X_i, Σ_j Y_j) = Σ_i Σ_j Cov(X_i, Y_j).
In particular, Var(Σ_i X_i) = Σ_i Var(X_i) + 2 Σ_{i<j} Cov(X_i, X_j).
Expectation, conditional expectation

Correlation:

    ρ(X, Y) = Cov(X, Y) / √(Var(X) Var(Y)).

The correlation is always between −1 and 1. If X and Y
are independent, then ρ(X, Y) = 0. But the converse
is not true. Generally, the correlation (as well as the
covariance) is a measure of the degree of dependence
between X and Y.

Note that for all a > 0, b > 0,

    ρ(aX + c, bY + d) = ρ(X, Y).
Expectation, conditional expectation

Example: Let X_1, ..., X_n be independent and identically
distributed random variables with expected value μ and
variance σ², and let X̄ be the sample mean.

The random variable

    S² = Σ_{i=1}^{n} (X_i − X̄)² / (n − 1)

is called the sample variance.

Find E[S²].
Expectation, conditional expectation

Example: Let I_A and I_B be indicator variables for the
events A and B. Find Cov(I_A, I_B).

Thus two events are independent if and only if the
corresponding indicator variables are uncorrelated.
In other words, for indicator variables, independence
and uncorrelatedness are equivalent.
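The identity behind this example, Cov(I_A, I_B) = P(A ∩ B) − P(A)P(B), can be checked on a small sample space; the events below are chosen for illustration:

```python
from itertools import product
from fractions import Fraction

# Cov(I_A, I_B) = P(A ∩ B) - P(A)P(B), checked on two fair coin
# tosses with A = "first toss is H" and B = "at least one H".
omega = list(product("HT", repeat=2))    # 4 equally likely outcomes
pr = Fraction(1, 4)

def indicator(event):
    return {w: (1 if event(w) else 0) for w in omega}

I_A = indicator(lambda w: w[0] == "H")
I_B = indicator(lambda w: "H" in w)

E = lambda f: sum(f[w] * pr for w in omega)
cov = E({w: I_A[w] * I_B[w] for w in omega}) - E(I_A) * E(I_B)
print(cov)  # 1/8: A and B are positively correlated, hence dependent
```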


Expectation, conditional expectation

Example: Let X_1, ..., X_n be independent and identically
distributed random variables having variance σ².
Show that Cov(X̄, X_i − X̄) = 0.
Bivariate normal distribution

Definition: The joint density for a bivariate normal
distribution is

    f(x, y) = 1 / (2π σ_x σ_y √(1 − ρ²)) ×
              exp{ − [ (x − μ_x)²/σ_x² − 2ρ(x − μ_x)(y − μ_y)/(σ_x σ_y) + (y − μ_y)²/σ_y² ] / (2(1 − ρ²)) },

for −∞ < x, y < ∞.
Bivariate normal distribution

Remarks on bivariate normal random variables (X, Y):
(a)  Marginally, X ~ N(μ_x, σ_x²) and Y ~ N(μ_y, σ_y²).
(b)  Conditionally, X | Y = y ~ N(μ_x + ρ (σ_x/σ_y)(y − μ_y), σ_x²(1 − ρ²)).
(c)  Corr(X, Y) = ρ.
(d)  Linear combinations of X and Y are normal random
variables, even though X and Y are not independent
when ρ ≠ 0.
(e)  Two jointly normal random variables are independent iff
they are uncorrelated.
Bivariate normal distribution

Example: For bivariate normal random variables X and Y
with parameters       , find P(X < Y).
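The standard route is to note that X − Y is normal with mean μ_x − μ_y and variance σ_x² + σ_y² − 2ρσ_xσ_y (by remark (d)), so P(X < Y) = Φ(−(μ_x − μ_y)/sd). The example's parameter values were not reproduced above, so the sketch below uses assumed values purely for illustration:

```python
from math import erf, sqrt

# P(X < Y) = P(X - Y < 0) for bivariate normal (X, Y).
# Assumed illustrative parameters (not the example's actual values):
mu_x, mu_y, sx, sy, rho = 1.0, 2.0, 1.0, 1.0, 0.5

def Phi(z):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

sd = sqrt(sx**2 + sy**2 - 2 * rho * sx * sy)   # sd of X - Y
answer = Phi(-(mu_x - mu_y) / sd)
print(round(answer, 4))  # 0.8413 for these assumed parameters
```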
Conditional expectation

Recall that if X and Y are jointly discrete random variables,
it is natural to define the conditional expectation of X
given Y = y as:

    E[X | Y = y] = Σ_x x p_{X|Y}(x|y).
Conditional expectation

For continuous random variables:

The conditional expectation of X, given that Y = y, is

    E[X | Y = y] = ∫_{−∞}^{∞} x f_{X|Y}(x|y) dx.
Conditional expectation

Example: Let X and Y have the joint pdf


Find the conditional expectation E[X | Y = y].





Conditional expectation

Conditional variance:
The conditional variance of X | Y = y is the expected
squared difference between the random variable X and its
conditional mean, conditioning on the event Y = y:

    Var(X | Y = y) = E[(X − E[X | Y = y])² | Y = y].

Similar to the unconditional case, we can show

    Var(X | Y = y) = E[X² | Y = y] − (E[X | Y = y])².