
CONDITIONAL EXPECTATIONS

Conditional expectations of random variables are defined in terms of their conditional distributions.
Definition 4
If X and Y are discrete random variables and f (x / y ) is the value of the conditional
probability distribution of X given Y = y at x, the conditional expectation of u( X )
given Y = y is
E[u(X) \mid Y = y] = \sum_x u(x) \cdot f(x/y)

Correspondingly, if X and Y are continuous random variables and f(x/y) is the value of the conditional probability density of X given Y = y at x, the conditional expectation of u(X) given Y = y is

E[u(X) \mid Y = y] = \int_{-\infty}^{\infty} u(x) \cdot f(x/y)\,dx

Similar expressions based on the conditional distribution or density of Y given X = x define the conditional expectation of v(Y) given X = x.
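To make the discrete case concrete, here is a minimal Python sketch of Definition 4: it forms the conditional distribution f(x/y) from a joint probability table and takes the weighted average of u(x). The helper name and the joint table are invented for illustration and are not taken from these notes.

# Minimal sketch of Definition 4, discrete case: E[u(X) | Y = y].
# The joint pmf below is hypothetical, chosen only to illustrate the computation.

def conditional_expectation(joint, y, u=lambda x: x):
    """E[u(X) | Y = y] for a discrete joint pmf given as {(x, y): probability}."""
    h_y = sum(p for (x, yv), p in joint.items() if yv == y)   # marginal h(y) = sum over x of f(x, y)
    return sum(u(x) * (p / h_y)                               # weights f(x/y) = f(x, y) / h(y)
               for (x, yv), p in joint.items() if yv == y)

# Hypothetical joint pmf of X in {0, 1, 2} and Y in {0, 1}
joint = {(0, 0): 0.10, (1, 0): 0.20, (2, 0): 0.10,
         (0, 1): 0.25, (1, 1): 0.25, (2, 1): 0.10}

print(conditional_expectation(joint, y=1))                        # E[X | Y = 1]
print(conditional_expectation(joint, y=1, u=lambda x: x ** 2))    # E[X^2 | Y = 1]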

If we let u(X) = X in Definition 4, we obtain the conditional mean of the random variable X given Y = y, which we denote by

\mu_{X/Y=y} = E[X \mid Y = y]
Correspondingly, the conditional variance of the random variable X given Y = y is

\sigma^2_{X/Y=y} = E[(X - \mu_{X/Y=y})^2 \mid Y = y] = E[X^2 \mid Y = y] - \mu^2_{X/Y=y}

where E[X^2 \mid Y = y] is given by Definition 4 with u(X) = X^2.
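Continuing the hypothetical sketch above, the conditional mean and variance follow from two calls to the same helper, using the identity \sigma^2_{X/Y=y} = E[X^2 \mid Y = y] - \mu^2_{X/Y=y}.

# Conditional mean and variance of X given Y = y, reusing conditional_expectation()
# and the hypothetical joint pmf from the previous sketch.

def conditional_mean_and_variance(joint, y):
    mu = conditional_expectation(joint, y)                        # E[X | Y = y]
    ex2 = conditional_expectation(joint, y, u=lambda x: x ** 2)   # E[X^2 | Y = y]
    return mu, ex2 - mu ** 2                                      # (conditional mean, conditional variance)

mean, variance = conditional_mean_and_variance(joint, y=1)
print(mean, variance)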

Example 22
With reference to Example 1, find the conditional mean of X given Y = 1.
Solution
Making use of the results obtained in Example 10, that is, f(0/1) = 4/7, f(1/1) = 3/7, and f(2/1) = 0, we get

E[X \mid Y = 1] = \sum_x x \cdot f(x/1) = 0 \cdot \frac{4}{7} + 1 \cdot \frac{3}{7} + 2 \cdot 0 = \frac{3}{7}
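As a quick numerical check of this result, the fragment below repeats the weighted average using only the conditional probabilities quoted above; exact fractions keep the answer as 3/7.

# Check of Example 22 with f(0/1) = 4/7, f(1/1) = 3/7, f(2/1) = 0 from the solution above.
from fractions import Fraction

f_given_y1 = {0: Fraction(4, 7), 1: Fraction(3, 7), 2: Fraction(0, 1)}
print(sum(x * p for x, p in f_given_y1.items()))   # prints 3/7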

Example 23
If the joint probability density of X and Y is given by
f(x, y) = \frac{2}{3}(x + 2y) for 0 < x < 1, 0 < y < 1, and f(x, y) = 0 elsewhere

Find the conditional mean and the conditional variance of X given Y = 1/2.
Solution
Performing the necessary integrations, we get
g(x) = \int_y f(x, y)\,dy = \int_{y=0}^{1} \frac{2}{3}(x + 2y)\,dy = \frac{2}{3}\left[xy + y^2\right]_{y=0}^{1} = \frac{2}{3}(x + 1)

i.e. g(x) = \frac{2}{3}(x + 1) for 0 < x < 1, and g(x) = 0 elsewhere.
Also
h(y) = \int_x f(x, y)\,dx = \int_{x=0}^{1} \frac{2}{3}(x + 2y)\,dx = \frac{2}{3}\left[\frac{x^2}{2} + 2xy\right]_{x=0}^{1} = \frac{2}{3}\left(\frac{1}{2} + 2y\right) = \frac{1}{3}(1 + 4y)

i.e. h(y) = \frac{1}{3}(1 + 4y) for 0 < y < 1, and h(y) = 0 elsewhere.
Then, substituting into the formula for a conditional density, we get
f(x/y) = \frac{f(x, y)}{h(y)} = \frac{\tfrac{2}{3}(x + 2y)}{\tfrac{1}{3}(1 + 4y)} = \frac{2x + 4y}{1 + 4y}

i.e. f(x/y) = \frac{2x + 4y}{1 + 4y} for 0 < x < 1, and f(x/y) = 0 elsewhere
so that

f(x / Y = 1/2) = \frac{2}{3}(x + 1) for 0 < x < 1, and 0 elsewhere.

Thus, \mu_{X/Y=1/2} is given by

\mu_{X/Y=1/2} = E[X \mid Y = \tfrac{1}{2}] = \int_{x=0}^{1} x \cdot \frac{2}{3}(x + 1)\,dx = \frac{2}{3}\int_{x=0}^{1} (x^2 + x)\,dx = \frac{2}{3}\left[\frac{x^3}{3} + \frac{x^2}{2}\right]_{x=0}^{1} = \frac{2}{3}\left(\frac{1}{3} + \frac{1}{2}\right) = \frac{5}{9}
Next we find
E[X^2 \mid Y = \tfrac{1}{2}] = \int_{x=0}^{1} x^2 \cdot \frac{2}{3}(x + 1)\,dx = \frac{2}{3}\int_{x=0}^{1} (x^3 + x^2)\,dx = \frac{2}{3}\left[\frac{x^4}{4} + \frac{x^3}{3}\right]_{x=0}^{1} = \frac{2}{3}\left(\frac{1}{4} + \frac{1}{3}\right) = \frac{7}{18}
and it follows that

\sigma^2_{X/Y=1/2} = \frac{7}{18} - \left(\frac{5}{9}\right)^2 = \frac{13}{162}
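These integrations can also be checked numerically. The sketch below uses only the conditional density f(x / Y = 1/2) = (2/3)(x + 1) derived above, together with SciPy's quad routine (assuming SciPy is available); it should print roughly 0.5556 and 0.0802, matching 5/9 and 13/162.

# Numerical check of Example 23: mu = 5/9 and sigma^2 = 13/162.
from scipy.integrate import quad

f_cond = lambda x: (2.0 / 3.0) * (x + 1.0)                  # f(x / Y = 1/2) on 0 < x < 1

mean, _ = quad(lambda x: x * f_cond(x), 0.0, 1.0)           # E[X | Y = 1/2]  ~ 0.5556
second, _ = quad(lambda x: x ** 2 * f_cond(x), 0.0, 1.0)    # E[X^2 | Y = 1/2] ~ 0.3889
print(mean, second - mean ** 2)                             # conditional variance ~ 0.0802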
EXERCISES
1. If X and Y have the joint probability distribution f(x, y) = \frac{1}{4} for x = -3 and y = -5, x = -1 and y = -1, x = 1 and y = 1, and x = 3 and y = 5, find cov(X, Y). [⇒ 8]
2. Given the values of the joint probability distribution of X and Y shown in the
table:

         x = 0     x = 1     x = 2
y = 0     1/12      1/6       1/24
y = 1     1/4       1/4       1/40
y = 2     1/8       1/20       0
y = 3     1/120      0         0

Find the covariance and the correlation coefficient of X and Y.


3. If X and Y have the joint probability distribution f(-1, 0) = 0, f(-1, 1) = \frac{1}{4}, f(0, 0) = \frac{1}{6}, f(0, 1) = 0, f(1, 0) = \frac{1}{12}, and f(1, 1) = \frac{1}{2}, show that:
(a) cov(X, Y) = 0;
(b) the two random variables are not independent.
4. The joint cumulative distribution function of two random variables X and Y is given by

F(x, y) = (1 - e^{-x^2})(1 - e^{-y^2}) for x > 0, y > 0, and F(x, y) = 0 elsewhere
Find:
(a) The joint probability density function of the two random variables X and
Y.
(b) Pr(1 < X ≤ 2, 1 < Y ≤ 2). [⇒ (e^{-1} - e^{-4})^2 = 0.1222]
(c) The covariance and the correlation coefficient of X and Y.

5. For k random variables X_1, X_2, \ldots, X_k, the values of their joint moment-generating function are given by

E\left(e^{t_1 X_1 + t_2 X_2 + \cdots + t_k X_k}\right)
(a) Show for either the discrete case or the continuous case that the partial derivative of the joint moment-generating function with respect to t_i at t_1 = t_2 = \cdots = t_k = 0 is E(X_i).
(b) Show for either the discrete case or the continuous case that the second partial derivative of the joint moment-generating function with respect to t_i and t_j, i ≠ j, at t_1 = t_2 = \cdots = t_k = 0 is E(X_i X_j).
(c) If two random variables have the joint probability density given by

f(x, y) = e^{-x - y} for x > 0, y > 0, and f(x, y) = 0 elsewhere

find their joint moment-generating function and use it to determine the values of E(XY), E(X), E(Y), and cov(X, Y).
[⇒ E(e^{t_1 X + t_2 Y}) = \frac{1}{(1 - t_1)(1 - t_2)}, E(XY) = 1, E(X) = 1, E(Y) = 1, and cov(X, Y) = 0]
6. If X_1, X_2, and X_3 are independent and have the means 4, 9, and 3 and the variances 3, 7, and 5, find the mean and the variance of:
(a) Y = 2X_1 - 3X_2 + 4X_3;
(b) Z = X_1 + 2X_2 - X_3.
7. Repeat both parts of Exercise 6, dropping the assumption of independence and
using instead the information that
cov(X_1, X_2) = 1, cov(X_2, X_3) = -2, and cov(X_1, X_3) = -3. [(a) 143; (b) 54]
8. If the joint probability density of X and Y is given by

f(x, y) = \frac{1}{3}(x + y) for 0 < x < 1, 0 < y < 2, and f(x, y) = 0 elsewhere

find the variance of W = 3X + 4Y - 5.

9. If var(X_1) = 5, var(X_2) = 4, var(X_3) = 7, cov(X_1, X_2) = 3, cov(X_1, X_3) = -2, and X_2 and X_3 are independent, find the covariance of Y_1 = X_1 - 2X_2 + 3X_3 and Y_2 = -2X_1 + 3X_2 + 4X_3. [75]
10. With reference to Exercise 6, find cov(Y, Z).
11. Given the values of the joint probability distribution of X and Y shown in the
table:
         x = -1    x = 1
y = -1    1/8       1/2
y = 0      0        1/4
y = 1     1/8        0

Find the conditional mean and the conditional variance of X given Y = -1.
[⇒ \mu_{X/Y=-1} = \frac{3}{5}, and \sigma^2_{X/Y=-1} = \frac{16}{25}]
12. If the joint probability density of X and Y is given by

f(x, y) = \frac{1}{4}(2x + y) for 0 < x < 1, 0 < y < 2, and f(x, y) = 0 elsewhere

find the conditional mean and the conditional variance of Y given X = \frac{1}{4}.
[⇒ \mu_{Y/X=1/4} = \frac{11}{9}, and \sigma^2_{Y/X=1/4} = \frac{23}{81}]
13. (a) Show that:
(i) E[X \cdot E(Y/X)] = E(XY)
(ii) E[E(Y/X)] = E(Y)
(iii) var(Y) = E[var(Y/X)] + var[E(Y/X)]

(b) A random variable X has mean \mu_1 and variance \sigma_1^2. A random variable Y has mean \mu_2 and variance \sigma_2^2. The correlation coefficient of X and Y is \rho_{12}. Given that E(Y/x) = a + bx, find the constants a and b in terms of \mu_1, \mu_2, \sigma_1^2, \sigma_2^2, and \rho_{12}.

14. Let X have the marginal density f(x) = 1 for -\frac{1}{2} < x < \frac{1}{2}, and f(x) = 0 elsewhere, and let the conditional density of Y given X = x be

f(y/x) = 1 for x < y < x + 1 when -\frac{1}{2} < x < 0,
f(y/x) = 1 for -x < y < 1 - x when 0 < x < \frac{1}{2},
and f(y/x) = 0 elsewhere.

Find \rho_{12}.

