Ch4 Handout

Chapter 4 discusses various probability inequalities, including Markov's, Chebyshev's, and Hoeffding's inequalities, which are essential for bounding quantities in statistics. It also covers inequalities for expectations, such as the Cauchy-Schwartz and Jensen's inequalities, which relate the expected values of functions of random variables. These inequalities are foundational tools in statistical theory and provide methods for establishing confidence intervals and analyzing convergence.


Chapter 4. Inequalities

Lecturer: Yanxi Hou
School of Data Science, Fudan University

Outline

1 Probability Inequalities
2 Inequalities For Expectations

4.1 Probability Inequalities

Inequalities are useful for bounding quantities that might be hard to compute.
They will also be used in the theory of convergence.

Theorem 1 (Markov’s inequality)
Let X be a non-negative random variable and suppose that E(X) exists. For any t > 0,

    P(X > t) ≤ E(X)/t.

Theorem 2 (Chebyshev’s inequality)
Let µ = E(X) and σ² = V(X). Then,

    P(|X − µ| ≥ t) ≤ σ²/t²   and   P(|Z| ≥ k) ≤ 1/k²,

where Z = (X − µ)/σ. In particular, P(|Z| > 2) ≤ 1/4 and P(|Z| > 3) ≤ 1/9.
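As a quick illustration (not part of the handout), here is a minimal Python sketch comparing both bounds with the tail of an assumed distribution, X ∼ Exponential(1), so that E(X) = 1 and V(X) = 1.

```python
import random

# Minimal numerical check of Markov's and Chebyshev's inequalities.
# The choice X ~ Exponential(1) is an assumption for illustration only:
# then E(X) = 1 and V(X) = 1.
random.seed(0)
n_sim = 200_000
samples = [random.expovariate(1.0) for _ in range(n_sim)]

t = 3.0
markov = 1.0 / t                  # Markov: P(X > t) <= E(X)/t = 1/t
# X > 3 implies |X - mu| >= 2 with mu = 1, so Chebyshev bounds the tail
# by sigma^2 / (t - mu)^2 = 1/4.
chebyshev = 1.0 / (t - 1.0) ** 2
empirical = sum(x > t for x in samples) / n_sim

print(f"empirical P(X > {t}) ~ {empirical:.4f}")   # about exp(-3) = 0.0498
print(f"Markov bound         = {markov:.4f}")      # 0.3333
print(f"Chebyshev bound      = {chebyshev:.4f}")   # 0.2500
```

Both bounds hold but are loose here; their value is that they require only the mean (Markov) or the mean and variance (Chebyshev).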
Example 1
Suppose we test a prediction method, a neural net for example, on a set of n new test cases.
Let X_i = 1 if the predictor is wrong;
Let X_i = 0 if the predictor is right.
Then X̄_n = n⁻¹ Σ_{i=1}^n X_i is the observed error rate. Each X_i may be regarded as a Bernoulli with unknown mean p. How likely is X̄_n to not be within ε of p?

Hoeffding’s inequality is similar in spirit to Markov’s inequality but it is a sharper inequality. We present the result here in two parts.

Theorem 3 (Hoeffding’s Inequality)
Let Y_1, ..., Y_n be independent observations such that E(Y_i) = 0 and a_i ≤ Y_i ≤ b_i. Let ε > 0. Then, for any t > 0,

    P(Σ_{i=1}^n Y_i ≥ ε) ≤ e^{−tε} Π_{i=1}^n e^{t²(b_i − a_i)²/8}.
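To make Theorem 3 concrete, the following sketch (not from the handout) evaluates its right-hand side for given bounds a_i ≤ Y_i ≤ b_i and crudely optimises over t; the bounds and ε below are hypothetical.

```python
import math

# Evaluate the right-hand side of Theorem 3 for given a_i <= Y_i <= b_i
# and a choice of t > 0, then search a grid of t values for the tightest bound.

def hoeffding_rhs(eps, a, b, t):
    """exp(-t*eps) * prod_i exp(t^2 (b_i - a_i)^2 / 8)."""
    exponent = -t * eps + (t ** 2 / 8.0) * sum((bi - ai) ** 2 for ai, bi in zip(a, b))
    return math.exp(exponent)

# Hypothetical setup: 10 centred observations, each bounded in [-1, 1].
a, b = [-1.0] * 10, [1.0] * 10
eps = 4.0
best = min(hoeffding_rhs(eps, a, b, t) for t in [k / 100.0 for k in range(1, 500)])
print(best)   # close to exp(-0.8) ~ 0.449, attained near t = 0.4
```

Since the bound holds for every t > 0, one is free to pick the t that makes it smallest; that choice is what produces Theorem 4 below.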

Theorem 4
Let X_1, ..., X_n ∼ Bernoulli(p). Then, for any ε > 0,

    P(|X̄_n − p| > ε) ≤ 2e^{−2nε²},

where X̄_n = n⁻¹ Σ_{i=1}^n X_i.

Example 2
Let X_1, ..., X_n ∼ Bernoulli(p) and n = 100, ε = 0.2. Compute P(|X̄_n − p| > ε) through
  Chebyshev’s inequality;
  Hoeffding’s inequality.
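A sketch of the computation Example 2 asks for, using the worst case p(1 − p) ≤ 1/4 in Chebyshev’s bound since p is unknown:

```python
import math

# Bounds for Example 2: n = 100, eps = 0.2, p unknown.
n, eps = 100, 0.2

# Chebyshev: P(|Xbar_n - p| > eps) <= V(Xbar_n)/eps^2 = p(1-p)/(n eps^2) <= 1/(4 n eps^2)
chebyshev = 1.0 / (4 * n * eps ** 2)

# Hoeffding (Theorem 4): P(|Xbar_n - p| > eps) <= 2 exp(-2 n eps^2)
hoeffding = 2 * math.exp(-2 * n * eps ** 2)

print(f"Chebyshev bound: {chebyshev:.5f}")   # 0.06250
print(f"Hoeffding bound: {hoeffding:.5f}")   # 0.00067
```

Hoeffding’s bound is roughly two orders of magnitude smaller here, which is what “sharper” means in practice.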
Hoeffding’s inequality provides a simple way to create a confidence interval for a binomial parameter p. Fix α > 0 and let

    ε_n = √( (1/(2n)) log(2/α) ).

By Hoeffding’s inequality,

    P(|X̄_n − p| > ε_n) ≤ 2e^{−2nε_n²} = α.

Hence we call C = (X̄_n − ε_n, X̄_n + ε_n) a 1 − α confidence interval, since

    P(p ∉ C) = P(|X̄_n − p| > ε_n) ≤ α,

i.e. P(p ∈ C) ≥ 1 − α: the random interval C traps the true parameter p with probability at least 1 − α.
More detail about confidence intervals can be found in Chapter 6.

The following inequality is useful for bounding probability statements about Normal random variables.

Theorem 5 (Mill’s Inequality)
Let Z ∼ N(0, 1). Then,

    P(|Z| > t) ≤ √(2/π) · e^{−t²/2}/t.
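The construction above translates directly into code. The sketch below uses illustrative values of n, α and the observed error rate (assumptions, not taken from the handout), and also checks Mill’s bound against the exact Normal tail via math.erfc.

```python
import math

def hoeffding_ci(xbar, n, alpha):
    """1 - alpha interval (xbar - eps_n, xbar + eps_n) with
    eps_n = sqrt(log(2/alpha) / (2n))."""
    eps_n = math.sqrt(math.log(2.0 / alpha) / (2.0 * n))
    return xbar - eps_n, xbar + eps_n

# Hypothetical values: n = 100 test cases, observed error rate 0.18, alpha = 0.05.
lo, hi = hoeffding_ci(0.18, 100, 0.05)
print(f"95% Hoeffding interval: ({lo:.3f}, {hi:.3f})")   # eps_n ~ 0.136

# Mill's inequality (Theorem 5) versus the exact tail P(|Z| > t) = erfc(t / sqrt(2)).
t = 2.0
mills = math.sqrt(2.0 / math.pi) * math.exp(-t ** 2 / 2.0) / t
exact = math.erfc(t / math.sqrt(2.0))
print(f"Mill's bound {mills:.4f} >= exact tail {exact:.4f}")   # 0.0540 >= 0.0455
```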

4.2 Inequalities For Expectations

Theorem 6 (Cauchy-Schwartz inequality)
If X and Y have finite variances then

    E|XY| ≤ √(E(X²)E(Y²)).

Recall that a function g is convex if for each x, y and each α ∈ [0, 1],

    g(αx + (1 − α)y) ≤ αg(x) + (1 − α)g(y).

We note that:
  If g is twice differentiable and g″(x) ≥ 0 for all x, then g is convex.
  If g is convex, then g lies above any tangent line of g.
  A function g is concave if −g is convex.
Examples of convex functions are g(x) = x² and g(x) = e^x. Examples of concave functions are g(x) = −x² and g(x) = log x.
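A minimal Monte Carlo check of the Cauchy-Schwartz inequality, using one hypothetical pair of random variables (X ∼ N(0, 1) and Y = X plus independent N(0, 1) noise, an assumption made purely for illustration):

```python
import math
import random

# Monte Carlo check of Cauchy-Schwartz for one hypothetical pair (X, Y):
# X ~ N(0, 1), Y = X + independent N(0, 1) noise.
random.seed(0)
n = 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [x + random.gauss(0.0, 1.0) for x in xs]

lhs = sum(abs(x * y) for x, y in zip(xs, ys)) / n                             # E|XY|
rhs = math.sqrt((sum(x * x for x in xs) / n) * (sum(y * y for y in ys) / n))  # sqrt(E(X^2) E(Y^2))
print(lhs <= rhs, round(lhs, 3), round(rhs, 3))
```

The sample averages satisfy the inequality exactly, since the discrete Cauchy-Schwartz inequality holds for any finite collection of numbers.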
Theorem 7 (Jensen’s inequality)
If g is convex, then

    E(g(X)) ≥ g(E(X)).

If g is concave, then

    E(g(X)) ≤ g(E(X)).

From Jensen’s inequality we see that
  E(X²) ≥ (E(X))²,
  E(1/X) ≥ 1/E(X) if X is positive,
  E(log X) ≤ log E(X) since log is concave.
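A small worked check of these consequences of Jensen’s inequality, using a hypothetical discrete distribution chosen only for illustration:

```python
import math

# Hypothetical discrete distribution (for illustration only):
# X takes values 1, 2, 4 with probabilities 0.5, 0.3, 0.2.
values = [1.0, 2.0, 4.0]
probs = [0.5, 0.3, 0.2]

EX    = sum(p * x for p, x in zip(probs, values))              # E(X)     = 1.9
EX2   = sum(p * x * x for p, x in zip(probs, values))          # E(X^2)   = 4.9
Einv  = sum(p / x for p, x in zip(probs, values))              # E(1/X)   = 0.7
ElogX = sum(p * math.log(x) for p, x in zip(probs, values))    # E(log X)

print(EX2 >= EX ** 2)           # convex g(x) = x^2:  E(X^2) >= (E X)^2
print(Einv >= 1.0 / EX)         # convex g(x) = 1/x on (0, inf)
print(ElogX <= math.log(EX))    # concave g(x) = log x
```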
