
UC Berkeley

Department of Electrical Engineering and Computer Science

EE 126: Probability and Random Processes

Problem Set 5
Spring 2007

Issued: Thursday, March 8, 2007. Due: Thursday, March 15, 2007.

Problem 5.1

1. N = 200,000.

2. N = 100,000.

Problem 5.2

a) This is a straight application of the Chebyshev inequality, which tells us that

   P(|X − 7| ≥ 3) ≤ Var(X)/3² = 2/3,

and therefore that P(4 < X < 10) ≥ 1/3.

b) If the variance is 9, then P(|X − 7| ≥ 3) ≤ 9/3² = 1, and therefore P(4 < X < 10) ≥ 0, which is vacuous.
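For a concrete sanity check, the Python sketch below simulates one distribution consistent with part a). It assumes, purely for illustration, X ~ Normal(7, 6), i.e., one particular choice with mean 7 and the variance 6 implied by the bound above.

import numpy as np

# Monte Carlo check of the Chebyshev bound in part a).
# Illustrative assumption (not from the problem statement): X ~ Normal(7, 6).
rng = np.random.default_rng(0)
x = rng.normal(loc=7.0, scale=np.sqrt(6.0), size=1_000_000)

tail = np.mean(np.abs(x - 7.0) >= 3.0)    # P(|X - 7| >= 3)
middle = np.mean((x > 4.0) & (x < 10.0))  # P(4 < X < 10)
print(f"P(|X-7| >= 3) = {tail:.3f}, Chebyshev bound 2/3 = {2/3:.3f}")
print(f"P(4 < X < 10) = {middle:.3f}, guaranteed >= 1/3 = {1/3:.3f}")

For this normal choice the tail probability is about 0.22, well under the 2/3 bound; Chebyshev must hold for every distribution with this mean and variance, which is why it is so loose.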

Problem 5.3

(a)
                 { 1/(4k)   if k = 1, 2, 3, 4 and n = 1, . . . , k
    pN,K(n, k) =
                 { 0        otherwise

(b)
            { 1/4 + 1/8 + 1/12 + 1/16 = 25/48   n = 1
            { 1/8 + 1/12 + 1/16       = 13/48   n = 2
    pN(n) = { 1/12 + 1/16             =  7/48   n = 3
            { 1/16                    =  3/48   n = 4
            { 0                                 otherwise

(c) The conditional PMF is pK|N(k|2) = pN,K(2, k)/pN(2):

                  { 6/13   k = 2
    pK|N(k|2) =   { 4/13   k = 3
                  { 3/13   k = 4
                  { 0      otherwise

(d) Let A be the event that Chuck bought at least 2 but no more than 3 books. Then
    E[K|A] = 3 and var(K|A) = 3/5.

(e) E[T] = 21/4
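Since everything here is a finite PMF, parts (b)-(d) can be checked exactly in a few lines of Python (part (d) takes A = {N ∈ {2, 3}}, the reading used above):

from fractions import Fraction as F

# Joint PMF from part (a): pN,K(n, k) = 1/(4k) for k = 1..4 and n = 1..k.
p = {(n, k): F(1, 4 * k) for k in range(1, 5) for n in range(1, k + 1)}

# (b) Marginal PMF of N.
pN = {n: sum(v for (m, _k), v in p.items() if m == n) for n in range(1, 5)}
print(pN)  # 25/48, 13/48, 7/48, 3/48 (the last prints reduced as 1/16)

# (c) Conditional PMF of K given N = 2.
print({k: p.get((2, k), F(0)) / pN[2] for k in (2, 3, 4)})  # 6/13, 4/13, 3/13

# (d) Condition on A = {N in {2, 3}}: E[K|A] and var(K|A).
pA = pN[2] + pN[3]
pK_A = {k: sum(p.get((n, k), F(0)) for n in (2, 3)) / pA for k in range(1, 5)}
EK = sum(k * q for k, q in pK_A.items())
print(EK, sum(k * k * q for k, q in pK_A.items()) - EK ** 2)  # 3, 3/5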

Problem 5.4

1.
            { 1/20   x = 1
            { 3/20   x = 2
    pX(x) = { 3/10   x = 3
            { 1/2    x = 4
            { 0      otherwise

    E[X] = 13/4

2.
            { 1/10   w = 2
            { 1/20   w = 3
    pW(w) = { 7/20   w = 4
            { 3/10   w = 6
            { 1/5    w = 8
            { 0      otherwise

3. E[R] = 3/4, var(R) = 63/80

4. σR|A = 3/4
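As a consistency check, each PMF above must sum to 1, and pX must reproduce E[X] = 13/4; exact rational arithmetic makes this a short sketch:

from fractions import Fraction as F

# The two PMFs exactly as stated above.
pX = {1: F(1, 20), 2: F(3, 20), 3: F(3, 10), 4: F(1, 2)}
pW = {2: F(1, 10), 3: F(1, 20), 4: F(7, 20), 6: F(3, 10), 8: F(1, 5)}

assert sum(pX.values()) == 1 and sum(pW.values()) == 1  # valid PMFs
print(sum(x * q for x, q in pX.items()))                # E[X] = 13/4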

Problem 5.5

1. µ = 3, σ² = 2.

2. µ = 5, σ² = 20.

Problem 5.6

1. To calculate the probability of heads on the next flip, we use the continuous version of
the total probability theorem (here P(H | P = p) = p, and fP(p) = p eᵖ for 0 ≤ p ≤ 1):

    P(H) = ∫₀¹ P(H | P = p) fP(p) dp
         = ∫₀¹ p² eᵖ dp
         = e − 2

2. Here we need to use the continuous version of Bayes' theorem, where A is the event of
heads on the first flip (so P(A|p) = p):

    fP|A(p|A) = P(A|p) fP(p) / ∫₀¹ P(A|p) fP(p) dp

              = { p² eᵖ / (e − 2)   0 ≤ p ≤ 1
                { 0                 otherwise

3. Now we apply the above result with the technique used in part 1:

    P(H|A) = ∫₀¹ P(H | P = p) fP|A(p|A) dp
           = (1/(e − 2)) ∫₀¹ p³ eᵖ dp
           = (6 − 2e)/(e − 2)
           ≈ .564/.718
           ≈ .786
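The integrals above are easy to confirm numerically. A midpoint-rule sketch, using the prior fP(p) = p eᵖ identified in part 1:

import math
import numpy as np

# Numerical check of Problem 5.6 with the prior f_P(p) = p * e^p on [0, 1].
n = 1_000_000
dp = 1.0 / n
p = (np.arange(n) + 0.5) * dp                    # midpoint grid on (0, 1)
prior = p * np.exp(p)

print(np.sum(prior) * dp)                        # ~1.0: prior integrates to 1
print(np.sum(p * prior) * dp, math.e - 2)        # P(H) = e - 2 ~ .718
posterior = p * prior / (math.e - 2)             # f_{P|A} after one observed head
print(np.sum(p * posterior) * dp,
      (6 - 2 * math.e) / (math.e - 2))           # P(H|A) ~ .786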

Problem 5.7

Note that we can rewrite E[X1 | Sn = sn , Sn+1 = sn+1 , . . .] as follows:

E[X1 | Sn = sn , Sn+1 = sn+1 , . . .]


= E[X1 | Sn = sn , Xn+1 = sn+1 − sn , Xn+2 = sn+2 − sn+1 , . . .]
= E[X1 | Sn = sn ],

where the last equality holds because the Xi's are independent, so Xn+1, Xn+2, . . . carry no information about X1 beyond Sn.
We also note that

E[X1 + · · · + Xn | Sn = sn] = E[Sn | Sn = sn] = sn.

It follows from the linearity of expectations that

E[X1 + · · · + Xn | Sn = sn ] = E[X1 | Sn = sn ] + · · · + E[Xn | Sn = sn ]

Because the Xi ’s are identically distributed, we have the following relationship:

E[Xi | Sn = sn ] = E[Xj | Sn = sn ], for any 1 ≤ i ≤ n, 1 ≤ j ≤ n.

Therefore,

    E[X1 + · · · + Xn | Sn = sn] = nE[X1 | Sn = sn] = sn  ⇒  E[X1 | Sn = sn] = sn/n.
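The result is also easy to see in simulation. The sketch below assumes, purely for illustration, that the Xi are fair six-sided dice with n = 5, conditioning on the event {S5 = 12}; the conditional mean of X1 should match 12/5 = 2.4:

import numpy as np

# Empirical check of E[X1 | Sn = sn] = sn / n for i.i.d. dice (illustrative).
rng = np.random.default_rng(0)
n, s_target = 5, 12
x = rng.integers(1, 7, size=(2_000_000, n))   # 2M trials of n dice
cond = x.sum(axis=1) == s_target              # keep trials with S_n = 12
print(x[cond, 0].mean(), s_target / n)        # both ~ 2.4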

Problem 5.8

a) γ is determined by the requirement that the density integrate to 1. Since (X, Y ) is
uniformly distributed over R, we have

    ∬R γ dx dy = 1,

where the integral is over the region R. Therefore 1/γ = Area(R).


b) Showing independence is equivalent to showing that

    fX,Y (x, y) = fX (x) · fY (y).

But this is clear, since on the unit square

    fX,Y (x, y) = 1 = fX (x) · fY (y).
c) Consider a region that consists of the upper half plane together with the point (1, −1). If we
are told that Y < 0, then X is determined (X = 1), and hence X and Y cannot be independent
for this region.
d) To find the probability that (X, Y ) lies in the circle C inscribed in the square R of part (b),
we could integrate, or observe that the integral will in fact come out to the area of the circle
(γ = 1 here), and hence the desired probability is the ratio of the area of the circle to the area
of the square:

    P((X, Y ) ∈ C) = .25π/1 = π/4.
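Part d) also makes a natural Monte Carlo exercise. Assuming, as above, that R is the unit square, C is the inscribed circle of radius 1/2 centered at (1/2, 1/2):

import math
import numpy as np

# Monte Carlo estimate of P((X, Y) in C) for (X, Y) uniform on the unit square.
rng = np.random.default_rng(0)
xy = rng.random((1_000_000, 2))                 # uniform points in the square
inside = ((xy - 0.5) ** 2).sum(axis=1) <= 0.25  # squared distance to center <= (1/2)^2
print(inside.mean(), math.pi / 4)               # both ~ .785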
