
Math 149A HW 5

Solutions

1. Suppose that X is a random variable such that E(X) = 13 and Var(X) = 25. Using
Chebyshev's inequality, give a bound on P(5 < X < 21).

Solution: This is similar to the last example from class on 10/30. The idea is to write

P(5 < X < 21) = P(|X − 13| < 8) = 1 − P(|X − 13| ≥ 8).

Chebyshev's inequality states that

P(|X − µ| ≥ kσ) ≤ 1/k².

In this case σ = 5, and we want kσ = 8. So we'll take k = 1.6, giving a bound

P(|X − 13| ≥ 8) ≤ 1/1.6² ≈ 0.391.

This means that

P(5 < X < 21) ≥ 1 − 0.391 ≈ 0.609.

Remark: Notice that the inequality sign flipped when I did the subtraction!
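A further remark (added as a sanity check, not part of the original solution): Chebyshev's bound holds for every distribution with this mean and variance, so any concrete choice must satisfy it. The short Python sketch below compares the bound against one such distribution, a Normal(13, 25), whose cdf can be written with the error function:

```python
import math

mu, var = 13, 25
sigma = math.sqrt(var)

# Chebyshev: P(|X - 13| >= 8) <= 1/k^2 with kσ = 8, so k = 8/σ = 1.6
k = 8 / sigma
chebyshev_lower = 1 - 1 / k**2   # lower bound on P(5 < X < 21)

# Exact probability for one concrete distribution with this mean and
# variance: X ~ Normal(13, 25).  (Chebyshev holds for *any* such X.)
def normal_cdf(x):
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

exact_normal = normal_cdf(21) - normal_cdf(5)

print(round(chebyshev_lower, 3))  # 0.609
print(exact_normal)               # ≈ 0.8904, well above the bound
```

The exact normal probability (about 0.89) is much larger than the guaranteed 0.609, previewing the looseness discussed in problem 2.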

2. Chebyshev's inequality (with k = 2) says that for any variable X with mean µ and
variance σ², we have

P(|X − µ| ≥ 2σ) ≤ 1/4.

However, "at most" is not the same as equal, and the actual probability may be very different.
For each of the following four distributions, determine the actual probability that |X − µ| ≥ 2σ.
Part a: X is continuous, with pdf given by f(x) = 1/2 for 0 ≤ x < 2, and 0 otherwise.

Solution: We have

µ = ∫₀² (x/2) dx = x²/4 |₀² = 1

E(X²) = ∫₀² (x²/2) dx = x³/6 |₀² = 4/3

Var(X) = E(X²) − [E(X)]² = 1/3

So σ = 1/√3. The event we care about is |X − 1| ≥ 2/√3, which corresponds to either

X ≥ 1 + 2/√3 ≈ 2.15 or X ≤ 1 − 2/√3 ≈ −0.15,

but X is never in either of these ranges, because X is always between 0 and 2. So the
probability is 0.
Remark: One way you could have run into trouble here is to try writing

P(|X − 1| ≥ 2/√3) = 1 − P(1 − 2/√3 < X < 1 + 2/√3)
= 1 − ∫_{1−2/√3}^{1+2/√3} (1/2) dx

The problem with this is that the pdf of X is not 1/2 everywhere – only between 0 and 2. So
you can't just plug in 1/2 for the pdf without checking where x is (this also shows up in part
b). If you work out numerically the value you get by doing this integral, you'll end up with
a negative number for your answer. This should be a sign that something has gone wrong!
Part b: X is continuous, with pdf given by f(x) = 2e^{−2x} for 0 ≤ x and 0 otherwise (note:
You computed µ and σ for this distribution on the last homework already!)

Solution: On the last homework, you found µ = 1/2, σ² = 1/4. So we want P(|X − 1/2| ≥ 1).
Since X is always non-negative, this corresponds to the event X ≥ 3/2. This is

P(X ≥ 3/2) = 1 − P(X < 3/2)
= 1 − ∫₀^{3/2} 2e^{−2x} dx
= 1 − (−e^{−2x} |₀^{3/2})
= 1 − (1 − e^{−3}) = e^{−3} ≈ 0.050
Part c: X is discrete, with pmf given by p(1) = p(−1) = 1/2, and 0 everywhere else.

Solution: We have

E(X) = 1/2 − 1/2 = 0,

Var(X) = (1)²(1/2) + (−1)²(1/2) = 1

But this means X is always within 2σ of the mean, so the probability is 0.
Part d: X is discrete, with pmf given by p(4) = p(−4) = 1/8, p(0) = 3/4, and p(x) = 0
everywhere else.

Solution: We have

E(X) = 4(1/8) + (−4)(1/8) = 0

Var(X) = (4 − 0)²(1/8) + (−4 − 0)²(1/8) = 4

σ = 2

So the event "|X − µ| ≥ 2σ" corresponds to the event "|X| ≥ 4". Of the possible values of
X, this event holds when x = 4 or x = −4. So the probability we want is p(4) + p(−4) = 1/4.
Note that for all four of these distributions, Chebyshev's inequality says the same thing:
P(|X − µ| ≥ 2σ) ≤ 1/4. But the actual probabilities varied quite a bit from part to part. In
general, Chebyshev's inequality is only a bound on the probability that X is far away from the
mean.
For some distributions, the actual probability is much smaller than you'd guess just by looking
at Chebyshev's inequality. But, as part d shows, there are situations where Chebyshev's
inequality is actually tight (an equality). So if you want to beat Chebyshev's inequality, you
need to look beyond the mean and variance, which is where problem 3 comes into play.
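As a quick recap (an addition, not part of the original solution), the four exact probabilities worked out above can be collected and checked against the bound 1/4 in a few lines of Python:

```python
import math

# Exact values of P(|X - mu| >= 2*sigma), one per part of problem 2.
# Part a: uniform on [0, 2): the event lies entirely outside [0, 2).
part_a = 0.0
# Part b: pdf 2e^{-2x} on x >= 0: P(X >= 3/2) = e^{-3}.
part_b = math.exp(-3)
# Part c: X = +/-1 each with prob 1/2: |X| = 1 never reaches 2*sigma = 2.
part_c = 0.0
# Part d: p(4) = p(-4) = 1/8: P(|X| >= 4) = 1/8 + 1/8.
part_d = 1/8 + 1/8

for p in (part_a, part_b, part_c, part_d):
    assert p <= 1/4  # Chebyshev's bound holds in every case

print(part_b)  # ≈ 0.0498
print(part_d)  # 0.25 -- the bound is tight here
```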
3a: Let X be any random variable such that E(X − µ)⁴ exists. Show that for any c > 0, we
have

P(|X − µ| > c) ≤ E(X − µ)⁴ / c⁴.

Solution: We follow the same idea as when we proved Chebyshev's inequality: first write
everything in terms of the thing we know the expectation of, then apply Markov's inequality.
So we write

P(|X − µ| > c) = P((X − µ)⁴ > c⁴) ≤ P((X − µ)⁴ ≥ c⁴).

Since (X − µ)⁴ is non-negative, we can use Markov's inequality, which tells us that

P((X − µ)⁴ ≥ c⁴) ≤ E(X − µ)⁴ / c⁴.

This is exactly what we want.
3b: Let X be any random variable whose mgf M(t) is defined for all positive t. Show that
for any a > 0 and any t > 0, we have

P(X > a) ≤ e^{−at} M(t).

Solution: The right hand side here is

M(t)/e^{at} = E(e^{tX})/e^{at}.

In this form it's starting to look like part a. So our goal will be to use Markov's inequality,
applied to Y = e^{tX}. Again, we start by writing things in terms of Y:

P(X > a) = P(e^{tX} > e^{at}) ≤ P(Y ≥ e^{at}).

Here it's important that t is positive, so that e^{tX} is an increasing function of X and doesn't
flip the inequality. By Markov's inequality, we have

P(Y ≥ e^{at}) ≤ E(Y)/e^{at} = M(t)/e^{at},

and again we're done.
Remark: Chebyshev’s inequality says if the variance is small, a variable is usually close to
the mean. These inequalities say something similar, but rely on you knowing the fourth
moment or the mgf. In some cases Chebyshev’s inequality can be very far off (give you a
bound that’s far away from the actual probability), but these bounds can give you a much
better estimate.
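To make the remark concrete (this example is an addition, not from the homework), one can compare all three bounds for X ~ Normal(0, 1), where E(X − µ)⁴ = 3 and M(t) = e^{t²/2}:

```python
import math

a = 3.0

# Chebyshev (two-sided, sigma = 1): P(|X| >= a) <= 1/a^2
chebyshev = 1 / a**2
# Problem 3a, using E(X - mu)^4 = 3 for a standard normal
fourth_moment = 3 / a**4
# Problem 3b with M(t) = exp(t^2/2); taking t = a minimizes e^{-at} M(t)
t = a
mgf_bound = math.exp(-a * t) * math.exp(t**2 / 2)  # = e^{-a^2/2}

# True one-sided tail P(X > a) via the normal cdf
actual = 0.5 * (1 - math.erf(a / math.sqrt(2)))

# The two-sided bounds also bound the one-sided tail, so all three
# are valid upper bounds on P(X > 3):
print(chebyshev, fourth_moment, mgf_bound, actual)
# ≈ 0.111, 0.037, 0.011, 0.00135
```

Each refinement tightens the bound, though even the mgf bound still overshoots the true tail here.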
4. Let X and Y be two random variables whose joint pdf is given by

f(x, y) = (6/5)(x² + y) if 0 ≤ x, y < 1, and 0 otherwise.

Part a: Compute P(0 ≤ X ≤ 1/2 and 2/3 ≤ Y ≤ 1).

Solution: This is

∫₀^{1/2} ∫_{2/3}^1 (6/5)(x² + y) dy dx
= ∫₀^{1/2} (6/5) [x²y + y²/2]_{y=2/3}^{1} dx
= ∫₀^{1/2} (6/5) (x²/3 + 5/18) dx
= (6/5) [x³/9 + 5x/18]₀^{1/2}
= 11/60
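As an added sanity check (not part of the original solution), the value 11/60 ≈ 0.1833 can be confirmed with a crude midpoint-rule integration in Python:

```python
def f(x, y):
    # joint pdf (6/5)(x^2 + y) on the unit square
    return 6/5 * (x**2 + y)

# Midpoint rule over the rectangle [0, 1/2] x [2/3, 1]
n = 400
dx = 0.5 / n
dy = (1/3) / n
total = 0.0
for i in range(n):
    x = (i + 0.5) * dx
    for j in range(n):
        y = 2/3 + (j + 0.5) * dy
        total += f(x, y) * dx * dy

print(total)  # ≈ 0.18333 (= 11/60)
```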

Part b: Compute P(Y ≤ X)

Solution: The event Y ≤ X means that for each x, y runs from 0 up to x. So this is

∫₀^1 ∫₀^x (6/5)(x² + y) dy dx
= ∫₀^1 (6/5) [x²y + y²/2]_{y=0}^{x} dx
= ∫₀^1 (6/5) (x³ + x²/2) dx
= (6/5) [x⁴/4 + x³/6]₀^1
= (6/5)(1/4 + 1/6) = (6/5)(5/12) = 1/2
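The answer 1/2 can be double-checked numerically (an added sketch, not part of the original solution) by summing the joint pdf over a grid restricted to the region y ≤ x:

```python
def f(x, y):
    # joint pdf (6/5)(x^2 + y) on the unit square
    return 6/5 * (x**2 + y)

# Midpoint grid over the unit square, keeping only cells with y <= x
n = 500
h = 1.0 / n
total = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        if y <= x:
            total += f(x, y) * h * h

print(total)  # close to 0.5 (the diagonal cells add a tiny overcount)
```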

Part c: Compute the cdf of (X, Y).
Solution: This is

P(X ≤ x and Y ≤ y) = ∫₀^x ∫₀^y (6/5)(s² + t) dt ds
= ∫₀^x (6/5) [s²t + t²/2]_{t=0}^{y} ds
= ∫₀^x (6/5) (s²y + y²/2) ds
= (6/5) [s³y/3 + sy²/2]_{s=0}^{x}
= (6/5) (x³y/3 + xy²/2) = (2x³y + 3xy²)/5

Part d: Compute the marginal (individual) distribution of X, then do the same for Y.

Solution: For X, the pdf is

f(x) = ∫₀^1 (6/5)(x² + y) dy = (6/5) [x²y + y²/2]_{y=0}^{1} = (6/5)x² + 3/5 for 0 ≤ x ≤ 1.

For Y, the pdf is

f(y) = ∫₀^1 (6/5)(x² + y) dx = (6/5) [x³/3 + xy]_{x=0}^{1} = 2/5 + (6/5)y for 0 ≤ y ≤ 1

(in both cases the pdf is 0 outside this range).
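A quick added check (not in the original solution): each marginal should integrate to 1 over [0, 1], which exact fraction arithmetic confirms:

```python
from fractions import Fraction as F

# f_X(x) = (6/5) x^2 + 3/5  ->  integral over [0,1] is (6/5)(1/3) + 3/5
int_fx = F(6, 5) * F(1, 3) + F(3, 5)
# f_Y(y) = 2/5 + (6/5) y    ->  integral over [0,1] is 2/5 + (6/5)(1/2)
int_fy = F(2, 5) + F(6, 5) * F(1, 2)

print(int_fx, int_fy)  # 1 1
```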
5. Let X and Y be two variables whose joint pdf is given by

f(x, y) = e^{−(x+y)} if x, y ≥ 0, and 0 otherwise.

Part a: Compute P(X + Y ≤ 1).


Solution: If X + Y ≤ 1, then 0 ≤ Y ≤ 1 − X. So the probability in question is

∫₀^1 ∫₀^{1−x} e^{−(x+y)} dy dx
= ∫₀^1 [−e^{−(x+y)}]_{y=0}^{1−x} dx
= ∫₀^1 (e^{−x} − e^{−1}) dx
= [−e^{−x} − e^{−1}x]_{x=0}^{1}   (since e^{−1} is a constant)
= 1 − 2/e
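Since e^{−(x+y)} = e^{−x} · e^{−y}, X and Y are independent Exponential(1) variables, so the answer 1 − 2/e can also be checked by simulation (an added sketch, not part of the original solution):

```python
import math
import random

# X, Y independent Exponential(1); estimate P(X + Y <= 1) by Monte Carlo.
random.seed(0)
n = 200_000
hits = sum(1 for _ in range(n)
           if random.expovariate(1) + random.expovariate(1) <= 1)

estimate = hits / n
exact = 1 - 2 / math.e
print(round(exact, 4))  # 0.2642
print(estimate)         # should land close to the exact value
```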

Part b: Suppose that Z = X + Y. Compute the cdf of Z.
Solution: We want P(X + Y ≤ z), so we do the same calculation as part a, but with the
1s in the upper limits replaced by z, getting

P(Z ≤ z) = ∫₀^z ∫₀^{z−x} e^{−(x+y)} dy dx
= ∫₀^z [−e^{−(x+y)}]_{y=0}^{z−x} dx
= ∫₀^z (e^{−x} − e^{−z}) dx
= [−e^{−x} − e^{−z}x]_{x=0}^{z}   (since e^{−z} is a constant)
= 1 − e^{−z} − ze^{−z}

Part c: What is the pdf of Z?


Solution: Now that we have the cdf, we just need to take a derivative using the product
rule:

f(z) = F′(z) = e^{−z} − (e^{−z} − ze^{−z}) = ze^{−z}
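As a final added check (not in the original solution), the pdf should match the derivative of the cdf from part b, which a central difference confirms numerically:

```python
import math

# cdf from part b and pdf from part c, for z >= 0
def F(z):
    return 1 - math.exp(-z) - z * math.exp(-z)

def f(z):
    return z * math.exp(-z)

# Compare f(z) against a central-difference derivative of F at a few points
h = 1e-6
for z in (0.5, 1.0, 2.0):
    numeric = (F(z + h) - F(z - h)) / (2 * h)
    assert abs(numeric - f(z)) < 1e-6

print("pdf matches d/dz of the cdf")
```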
