8.estimation I - 530

The document discusses point estimation in statistical inference. It introduces point estimation as estimating an unknown parameter using a single value. It describes parameter space, statistics, and estimators. It then covers two common methods of point estimation: method of moments estimation and maximum likelihood estimation. Method of moments estimation finds estimates by equating population moments to sample moments. Maximum likelihood estimation finds the parameter values that maximize the likelihood function given the sample data. The document provides examples and discusses properties and advantages/disadvantages of each method.


STATISTICAL INFERENCE

PART I
POINT ESTIMATION

STATISTICAL INFERENCE
• Determining certain unknown properties of
a probability distribution on the basis of a
sample (usually, a r.s.) obtained from that
distribution.

Point Estimation: θ̂ = 5

Interval Estimation: 3 < θ < 8

Hypothesis Testing: H0: θ = 5 vs. H1: θ ≠ 5
STATISTICAL INFERENCE
• Parameter Space (Θ or Ω): The set of all
possible values of an unknown parameter
θ; θ ∈ Θ.
• A pdf with unknown parameter: f(x; θ), θ ∈ Θ.
• Estimation: Where in Θ is θ likely to be?

{ f(x; θ), θ ∈ Θ }: the family of pdfs
STATISTICAL INFERENCE
• Statistic: A function of the rvs (usually of the
sample rvs in an estimation) which does not
contain any unknown parameters, e.g. X̄, S², etc.
• Estimator of an unknown parameter θ, denoted θ̂:
a statistic used for estimating θ.
θ̂: estimator, e.g. θ̂ = U(X1, X2, …, Xn)
X̄: estimator (a random variable)
x̄: estimate, a particular observed value of an estimator
POINT ESTIMATION
• θ: a parameter of interest; unknown.
• Goal: Find good estimator(s) for θ or a
function of it, g(θ).
METHODS OF ESTIMATION

• Method of Moments Estimation
• Maximum Likelihood Estimation
METHOD OF MOMENTS
ESTIMATION (MME)
• Let X1, X2,…,Xn be a r.s. from a population
with pmf or pdf f(x; θ1, θ2,…, θk). The MMEs
are found by equating the first k population
moments to the corresponding sample moments
and solving the resulting system of equations.

Population moments: $\mu_k = E(X^k)$
Sample moments: $M_k = \frac{1}{n}\sum_{i=1}^{n} X_i^k$
METHOD OF MOMENTS
ESTIMATION (MME)
1  M 1 2  M 2 3  M 3
so on…
1 1
E  X    X i E  X    X i3
n n
1n
E  X    Xi
2 2 3

n i1 n i1 n i1

Continue this until there are enough equations


to solve for the unknown parameters.

8
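As a concrete illustration of this recipe, here is a minimal Python sketch (not from the slides; the function names and data are hypothetical) that computes sample moments and applies them to a one- and a two-parameter family.

```python
# Illustrative sketch of the MME recipe; the data and names below are hypothetical.

def sample_moment(xs, k):
    """k-th sample moment: M_k = (1/n) * sum(x_i ** k)."""
    return sum(x ** k for x in xs) / len(xs)

xs = [2.1, 0.4, 1.7, 3.0, 0.8]   # hypothetical sample

# One-parameter family with E(X) = theta: equating mu_1 = M_1 gives theta_hat = M_1.
theta_hat = sample_moment(xs, 1)

# Two-parameter family with E(X) = mu and E(X^2) = sigma^2 + mu^2:
# solving mu_1 = M_1 and mu_2 = M_2 gives the MMEs below.
mu_hat = sample_moment(xs, 1)
sigma2_hat = sample_moment(xs, 2) - mu_hat ** 2
print(theta_hat, mu_hat, sigma2_hat)
```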
EXAMPLES
• Let X~Exp().
• For a r.s of size n, find the MME of .
• For the following sample (assuming it is
from Exp()), find the estimate of :
11.37, 3, 0.15, 4.27, 2.56, 0.59.
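The slide leaves the computation to the reader; a minimal sketch, assuming the mean parameterization E(X) = θ (if θ is instead the rate, so that E(X) = 1/θ, the MME is the reciprocal):

$\mu_1 = E(X) = \theta = M_1 = \bar{X} \;\Rightarrow\; \hat{\theta}_{MME} = \bar{X} = \frac{11.37 + 3 + 0.15 + 4.27 + 2.56 + 0.59}{6} \approx 3.66$

(under the rate parameterization, $\hat{\theta}_{MME} = 1/\bar{X} \approx 0.27$).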

EXAMPLES
• Let X~N(μ,σ²). For a r.s. of size n, find the
MMEs of μ and σ².
• For the following sample (assuming it is
from N(μ,σ²)), find the estimates of μ and
σ²:
4.93, 6.82, 3.12, 7.57, 3.04, 4.98, 4.62,
4.84, 2.95, 4.22
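A minimal Python sketch of this computation (my variable names); the closed forms $\hat{\mu} = \bar{X}$ and $\hat{\sigma}^2 = M_2 - \bar{X}^2$ follow from equating the first two moments.

```python
# MME for N(mu, sigma^2): equate E(X) = mu and E(X^2) = sigma^2 + mu^2
# to the first two sample moments.
xs = [4.93, 6.82, 3.12, 7.57, 3.04, 4.98, 4.62, 4.84, 2.95, 4.22]

n = len(xs)
m1 = sum(xs) / n                      # first sample moment (sample mean), about 4.71
m2 = sum(x ** 2 for x in xs) / n      # second sample moment

mu_hat = m1
sigma2_hat = m2 - m1 ** 2             # note the divisor n, not n - 1
print(mu_hat, sigma2_hat)
```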

DRAWBACKS OF MMES

• Although a parameter may be restricted to
positive values, its MME can be negative.
• If the required moments do not exist, we
cannot find the MMEs.
MAXIMUM LIKELIHOOD
ESTIMATION (MLE)
• Let X1, X2,…,Xn be a r.s. from a population
with pmf or pdf f(x; θ1, θ2,…, θk). The
likelihood function is defined by

$L(\theta_1, \theta_2, \ldots, \theta_k \mid x_1, x_2, \ldots, x_n) = f(x_1, x_2, \ldots, x_n; \theta_1, \theta_2, \ldots, \theta_k)$

$= \prod_{i=1}^{n} f(x_i \mid \theta_1, \theta_2, \ldots, \theta_k) = \prod_{i=1}^{n} f(x_i; \theta_1, \theta_2, \ldots, \theta_k)$
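To make the product form concrete, here is a small Python sketch (mine, not from the slides) that evaluates the likelihood and log-likelihood of an i.i.d. exponential sample; the rate parameterization f(x; θ) = θe^(−θx) is assumed purely for illustration.

```python
import math

# Assumed density for illustration: f(x; theta) = theta * exp(-theta * x), x > 0
def pdf(x, theta):
    return theta * math.exp(-theta * x)

def likelihood(xs, theta):
    # L(theta | x_1, ..., x_n) = product of f(x_i; theta) over the sample
    prod = 1.0
    for x in xs:
        prod *= pdf(x, theta)
    return prod

def log_likelihood(xs, theta):
    # the log scale is usually more convenient and avoids underflow for large n
    return sum(math.log(pdf(x, theta)) for x in xs)

xs = [11.37, 3, 0.15, 4.27, 2.56, 0.59]   # the sample from the earlier MME example
print(likelihood(xs, 0.3), log_likelihood(xs, 0.3))
```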
MAXIMUM LIKELIHOOD
ESTIMATION (MLE)
• For each sample point (x1,…,xn), let
$\hat{\theta}_1(x_1, \ldots, x_n), \ldots, \hat{\theta}_k(x_1, \ldots, x_n)$
be a parameter value at which
L(θ1,…,θk | x1,…,xn) attains its maximum as a
function of (θ1,…,θk), with (x1,…,xn) held fixed. A
maximum likelihood estimator (MLE) of the
parameters (θ1,…,θk) based on a sample (X1,…,Xn)
is
$\hat{\theta}_1(X_1, \ldots, X_n), \ldots, \hat{\theta}_k(X_1, \ldots, X_n)$
• The MLE is the parameter point for which the
observed sample is most likely.
EXAMPLES
• Let X~Bin(n,p), where both n and p are unknown. One
observation on X is available, and it is known that n is
either 2 or 3 and p=1/2 or 1/3. Our objective is to
estimate the pair (n,p).
x     (2,1/2)   (2,1/3)   (3,1/2)   (3,1/3)   Max. Prob.
0      1/4       4/9       1/8       8/27      4/9
1      1/2       4/9       3/8      12/27      1/2
2      1/4       1/9       3/8       6/27      3/8
3       0         0        1/8       1/27      1/8

$(\hat{n}, \hat{p})(x) = \begin{cases} (2, 1/3) & \text{if } x = 0 \\ (2, 1/2) & \text{if } x = 1 \\ (3, 1/2) & \text{if } x = 2 \\ (3, 1/2) & \text{if } x = 3 \end{cases}$
MAXIMUM LIKELIHOOD
ESTIMATION (MLE)
• It is usually convenient to work with the logarithm
of the likelihood function.
• Suppose that f(x; θ1, θ2,…, θk) is a positive,
differentiable function of θ1, θ2,…, θk. If a
supremum $\hat{\theta}_1, \hat{\theta}_2, \ldots, \hat{\theta}_k$ exists, it must satisfy the
likelihood equations

$\frac{\partial \ln L(\theta_1, \theta_2, \ldots, \theta_k; x_1, x_2, \ldots, x_n)}{\partial \theta_j} = 0, \qquad j = 1, 2, \ldots, k$

• An MLE occurring at the boundary of Θ cannot be
obtained by differentiation, so use inspection instead.
MLE
• Moreover, you need to verify that you are
in fact maximizing the log-likelihood (or
likelihood) by checking that the second
derivative is negative.

EXAMPLES
1. X~Exp(), >0. For a r.s of size n, find the
MLE of .
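A sketch of the solution, assuming the rate parameterization $f(x; \theta) = \theta e^{-\theta x}$ (under the mean parameterization the answer is $\hat{\theta} = \bar{X}$ instead):

$\ln L(\theta) = n \ln \theta - \theta \sum_{i=1}^{n} x_i, \qquad \frac{d \ln L}{d\theta} = \frac{n}{\theta} - \sum_{i=1}^{n} x_i = 0 \;\Rightarrow\; \hat{\theta}_{MLE} = \frac{n}{\sum x_i} = \frac{1}{\bar{X}}$

and $\frac{d^2 \ln L}{d\theta^2} = -n/\theta^2 < 0$, so this is indeed a maximum.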

EXAMPLES
2. X~N(,2). For a r.s. of size n, find the
MLEs of  and 2.
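A sketch of the standard derivation (stated without every algebraic step):

$\ln L(\mu, \sigma^2) = -\frac{n}{2}\ln(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i - \mu)^2$

Setting the partial derivatives with respect to $\mu$ and $\sigma^2$ equal to zero gives

$\hat{\mu}_{MLE} = \bar{X}, \qquad \hat{\sigma}^2_{MLE} = \frac{1}{n}\sum_{i=1}^{n}(X_i - \bar{X})^2$

(note the divisor n, not n−1).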

EXAMPLES
3. X~Uniform(0,), >0. For a r.s of size n,
find the MLE of .
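A sketch of the solution; this is a boundary case of the kind flagged two slides back, where differentiation fails and inspection is needed:

$L(\theta) = \prod_{i=1}^{n} \frac{1}{\theta}\,\mathbf{1}(0 \le x_i \le \theta) = \theta^{-n}\,\mathbf{1}\!\left(\theta \ge x_{(n)}\right)$

which is decreasing in $\theta$ on $[x_{(n)}, \infty)$, so it is maximized at the smallest admissible value: $\hat{\theta}_{MLE} = X_{(n)} = \max_i X_i$.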

INVARIANCE PROPERTY OF THE
MLE
• If θ̂ is the MLE of θ, then for any function
g(θ), the MLE of g(θ) is g(θ̂).

Example: X~N(μ,σ²). For a r.s. of size n, the
MLE of μ is X̄. By the invariance property
of the MLE, the MLE of μ² is X̄².
ADVANTAGES OF MLE
• Often yields good estimates, especially for
large sample sizes.
• Invariance property of MLEs.
• The asymptotic distribution of the MLE is normal.
• It is the most widely used estimation technique.
• MLEs are usually consistent estimators.
[Consistency will be defined later.]
DISADVANTAGES OF MLE
• Requires that the pdf or pmf is known except
for the values of the parameters.
• The MLE may not exist or may not be unique.
• The MLE may not be obtainable explicitly
(numerical or search methods may be required;
see the sketch below), and it can be sensitive
to the choice of starting values in a numerical
search.
• MLEs can be heavily biased for small
samples.
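As an illustration of the "numerical methods" point, here is a minimal sketch (my own, with hypothetical data) for a case with no closed-form MLE: estimating the shape parameter α of a Gamma(α, scale = 1) sample by a simple grid search over the log-likelihood.

```python
import math

def gamma_loglik(alpha, xs):
    # log-likelihood of an i.i.d. Gamma(alpha, scale = 1) sample:
    # log f(x; alpha) = (alpha - 1) * log(x) - x - log(Gamma(alpha))
    return sum((alpha - 1) * math.log(x) - x - math.lgamma(alpha) for x in xs)

xs = [1.2, 0.7, 2.3, 1.5, 0.9]                # hypothetical data
grid = [a / 100 for a in range(10, 1001)]     # candidate alpha values: 0.10 .. 10.00
alpha_hat = max(grid, key=lambda a: gamma_loglik(a, xs))
print(alpha_hat)
```

In practice a proper optimizer would replace the grid, but the idea is the same: maximize the log-likelihood numerically when the likelihood equations have no explicit solution.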