
Law of Total Variance

The document defines conditional probability, expectation, and variance given an event or random variable. It provides an example calculating the conditional variance of a random variable X given that X is greater than or equal to 1. The key steps are: (1) calculating the probability that X is greater than or equal to 1, (2) calculating the conditional expectation of X^2 given this event, and (3) using these to calculate the conditional variance as the conditional expectation of X^2 minus the square of the conditional expectation of X.

Uploaded by Aslı Yörüsün


Laws of Total Expectation and Total Variance

Definition of conditional density. Assume an arbitrary random variable X with density f_X. Take an event A with P(A) > 0. Then the conditional density f_{X|A} is defined as follows:

    f_{X|A}(x) = f(x)/P(A)   if x ∈ A,
    f_{X|A}(x) = 0           if x ∉ A.

Note that f_{X|A} is supported only on A.

Definition of conditional expectation conditioned on an event.

    E(h(X)|A) = ∫_A h(x) f_{X|A}(x) dx = (1/P(A)) ∫_A h(x) f_X(x) dx

Example. For the random variable X with density function

    f(x) = (1/64)x³   for 0 < x < 4,
    f(x) = 0          otherwise,

calculate E(X² | X ≥ 1).

Solution.

Step 1.

    P(X ≥ 1) = ∫_1^4 (1/64)x³ dx = (1/256)[x⁴]_{x=1}^{x=4} = 255/256

Step 2.

    E(X² | X ≥ 1) = (1/P(X ≥ 1)) ∫_{x≥1} x² f(x) dx
                  = (256/255) ∫_1^4 x² · (1/64)x³ dx
                  = (256/255)(1/64)(1/6)[x⁶]_{x=1}^{x=4}
                  = (256/255)(1/64)(4095/6) = 182/17

Example. For the previous example, calculate the conditional variance Var(X | X ≥ 1).

Solution. We already calculated E(X² | X ≥ 1). We only need to calculate E(X | X ≥ 1).

    E(X | X ≥ 1) = (1/P(X ≥ 1)) ∫_{x≥1} x f(x) dx
                 = (256/255) ∫_1^4 x · (1/64)x³ dx
                 = (256/255)(1/64)(1/5)[x⁵]_{x=1}^{x=4}
                 = (256/255)(1/64)(1023/5) = 1364/425

Finally:

    Var(X | X ≥ 1) = E(X² | X ≥ 1) − E(X | X ≥ 1)² = 182/17 − (1364/425)² = 73254/180625 ≈ 0.4056

Definition of conditional expectation conditioned on a random variable. Suppose two random variables X and Y. To define E(X|Y) we need the conditional density function. For this, let f_{X,Y}(x, y) be the joint density of the pair {X, Y}. Then the conditional density f_{X|Y} is defined as

    f_{X|Y}(x|y) = f_{X,Y}(x, y)/f_Y(y)   if f_Y(y) > 0,
    f_{X|Y}(x|y) = 0                      if f_Y(y) = 0.

Then E(h(X) | Y) is defined to be the random variable that assigns the value ∫_{−∞}^{∞} h(x) f_{X|Y}(x|y) dx to y in the continuous case, and the value Σ_x h(x) f_{X|Y}(x|y) in the discrete case.
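As a sanity check (not part of the original notes), the conditional moments in the continuous example can be evaluated exactly with Python's fractions module; the helper name poly_integral is our own:

```python
from fractions import Fraction

def poly_integral(power, a, b):
    """Exact value of the integral of x**power from a to b (power >= 0)."""
    return (Fraction(b) ** (power + 1) - Fraction(a) ** (power + 1)) / (power + 1)

c = Fraction(1, 64)  # density f(x) = x^3 / 64 on (0, 4)

p = c * poly_integral(3, 1, 4)        # P(X >= 1): integrand x^3 / 64
ex2 = c * poly_integral(5, 1, 4) / p  # E(X^2 | X >= 1): integrand x^2 * x^3 / 64
ex = c * poly_integral(4, 1, 4) / p   # E(X   | X >= 1): integrand x * x^3 / 64
var = ex2 - ex ** 2                   # Var(X | X >= 1)

print(p)    # 255/256
print(ex2)  # 182/17
print(ex)   # 1364/425
print(var)  # 73254/180625
```

Because Fraction arithmetic is exact, this reproduces each step of the hand computation with no rounding.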
Definition of conditional variance conditioned on an event.

    Var(X|A) = E(X²|A) − E(X|A)²

Example. Take the following joint density:

             X=1     X=2     X=3     f_Y(y)
    Y=1      0.07    0.10    0.23    0.4
    Y=2      0.25    0.08    0.07    0.4
    Y=3      0.05    0.04    0.11    0.2
    f_X(x)   0.37    0.22    0.41    1

Describe the random variable E(Y² | X).

Solution.

    E(Y² | X = 1) = Σ_y y² f_{Y|X}(y|1) = (1/f_X(1)) Σ_y y² f(X = 1, Y = y)
                  = (1/0.37) [(1)²(0.07) + (2)²(0.25) + (3)²(0.05)] = 1.52/0.37

Similarly:

    E(Y² | X = 2) = 0.78/0.22
    E(Y² | X = 3) = 1.50/0.41

So then:

    E(Y² | X) = 1.52/0.37   with probability P(X = 1) = 0.37
              = 0.78/0.22   with probability P(X = 2) = 0.22
              = 1.50/0.41   with probability P(X = 3) = 0.41

Law of Total Expectation.

    E(X) = E(E[X|Y])

Law of Total Variance.

    Var(X) = E(Var[X | Y]) + Var(E[X | Y])

Proof. By definition we have

    Var(X|Y) = E(X²|Y) − {E(X|Y)}²

By taking E of both sides we get:

    E[Var(X|Y)] = E[E[X²|Y]] − E[{E(X|Y)}²]
                = E(X²) − E[{E(X|Y)}²]                              (law of iterated expectations)
                = {E(X²) − {E(X)}²} − {E[{E(X|Y)}²] − {E(X)}²}
                = Var(X) − {E[{E(X|Y)}²] − {E[E(X|Y)]}²}
                = Var(X) − Var[E(X|Y)]

where the last two steps use {E(X)}² = {E[E(X|Y)]}² (law of total expectation) and the definition of variance applied to the random variable E(X|Y). By moving terms around, the claim follows.

Note: Using similar arguments, one can prove the law of total expectation.

Example (from the Dean's note): Two urns contain a large number of balls, each ball marked with one number from the set {0, 2, 4}. The proportion of each type of ball in each urn is displayed in the table below:

           Number on Ball
           0      2      4
    A      0.6    0.3    0.1
    B      0.1    0.3    0.6

An urn is randomly selected and then a ball is drawn at random from the urn. The number on
the ball is represented by the random variable X.

(a) Calculate the hypothetical means (or conditional means)

E[X|θ = A] and E[X|θ = B]

(b) Calculate the variance of the hypothetical means: Var[E[X|θ]].

(c) Calculate the process variances (or conditional variances)

Var[X|θ = A] and Var[X|θ = B]

(d) Calculate the expected value of the process variance: E[Var[X|θ]].

(e) Calculate the total variance (or unconditional variance) Var[X] and show that it equals
the sum of the quantities calculated in (b) and (d).

Solution: Part (a)

E[X|θ = A] = (0.6)(0) + (0.3)(2) + (0.1)(4) = 1.0

E[X|θ = B] = (0.1)(0) + (0.3)(2) + (0.6)(4) = 3.0

Part (b)
E[X] = E[E[X|θ]] = (1/2)(1.0) + (1/2)(3.0) = 2.0

Var[E[X|θ]] = (1/2)(1.0 − 2.0)² + (1/2)(3.0 − 2.0)² = 1.0
Part (c)

Var[X|θ = A] = (0.6)(0 − 1.0)² + (0.3)(2 − 1.0)² + (0.1)(4 − 1.0)² = 1.8

Var[X|θ = B] = (0.1)(0 − 3.0)² + (0.3)(2 − 3.0)² + (0.6)(4 − 3.0)² = 1.8

Part (d)
E[Var[X|θ]] = (1/2)(1.8) + (1/2)(1.8) = 1.8
Part (e)
Var[X] = (1/2)[(0.6)(0 − 2.0)² + (0.3)(2 − 2.0)² + (0.1)(4 − 2.0)²]
       + (1/2)[(0.1)(0 − 2.0)² + (0.3)(2 − 2.0)² + (0.6)(4 − 2.0)²]
       = 2.8

⇒ Var(X) = Var[E[X|θ]] + E[Var[X|θ]]
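The urn example can be checked end to end with a short Python sketch (exact arithmetic via fractions; the variable names are ours, not the Dean's note's):

```python
from fractions import Fraction

# Ball distributions in each urn (number on ball -> probability); urns equally likely.
urns = {
    "A": {0: Fraction(6, 10), 2: Fraction(3, 10), 4: Fraction(1, 10)},
    "B": {0: Fraction(1, 10), 2: Fraction(3, 10), 4: Fraction(6, 10)},
}
p_urn = Fraction(1, 2)

def mean(dist):
    return sum(x * p for x, p in dist.items())

def variance(dist):
    m = mean(dist)
    return sum((x - m) ** 2 * p for x, p in dist.items())

hyp_means = {u: mean(d) for u, d in urns.items()}      # (a) E[X | theta]
proc_vars = {u: variance(d) for u, d in urns.items()}  # (c) Var[X | theta]

overall_mean = p_urn * sum(hyp_means.values())         # E[X] = 2
var_hyp_means = p_urn * sum((m - overall_mean) ** 2
                            for m in hyp_means.values())  # (b) Var[E[X|theta]]
exp_proc_var = p_urn * sum(proc_vars.values())            # (d) E[Var[X|theta]]

# (e) Unconditional distribution of X and its variance computed directly:
marginal = {x: p_urn * (urns["A"][x] + urns["B"][x]) for x in (0, 2, 4)}
total_var = variance(marginal)

print(hyp_means["A"], hyp_means["B"])             # 1 3
print(var_hyp_means, exp_proc_var)                # 1 9/5
print(total_var == var_hyp_means + exp_proc_var)  # True
```

The final line confirms the decomposition Var(X) = Var[E[X|θ]] + E[Var[X|θ]] = 1.0 + 1.8 = 2.8 without ever conditioning in the direct calculation of Var(X).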
