
Autocorrelation

Definition:
The classical assumption is that the covariance between any two distinct disturbance terms is zero:

Cov(ε_i, ε_j) = 0, ∀ i, j, i ≠ j

i.e., the correlation between successive disturbances is zero.
If this covariance is not zero, we have autocorrelation.
When the variance of the disturbance term remains constant but the successive disturbance terms are correlated, the problem is termed autocorrelation.
Autocorrelation is primarily a time-series phenomenon.
First-order autocorrelation implies that neighbouring observations are correlated
– the observations are not independent draws from the population.
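
To make the definition concrete, here is a minimal simulation sketch (an illustration of my own, not from the notes); the value ρ = 0.7 and the sample size are arbitrary choices:

# Simulate first-order autocorrelated disturbances eps_t = rho*eps_{t-1} + u_t
import numpy as np

rng = np.random.default_rng(0)
rho, n = 0.7, 500

u = rng.normal(size=n)              # white-noise innovations
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = rho * eps[t - 1] + u[t]

# Correlation between successive disturbances: close to rho, not zero,
# so the assumption Cov(eps_i, eps_j) = 0 is violated.
print(np.corrcoef(eps[:-1], eps[1:])[0, 1])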

[Figure: residual patterns under positive and negative autocorrelation]

Examples:
1. The city of St. Paul has a spike in crime, so it hires additional police. The following year, the crime rate decreases significantly. Amazingly, the city of Minneapolis, which had not adjusted its police force, sees an increase in its crime rate over the same period.
2. If we are predicting the growth of stock dividends, an overestimate in one year is likely to lead to overestimates in succeeding years.
3. Athletes' performance when competing against exceptionally good or bad teams. This is especially evident in baseball because teams play each other 3-4 times in a row.

• Main idea: autocorrelation affects the efficiency of estimators.

Reasons for Autocorrelation:
Autocorrelation typically arises through one of the following avenues:
1. Inertia/Time to Adjust
1. This often occurs in macro time-series data. The US interest rate unexpectedly increases, and so there is an associated change in exchange rates with other countries. Reaching a new equilibrium can take some time.
2. With regard to unemployment, this is called hysteresis: certain sections of society remain prone to unemployment even after the original shock has passed.
2. Cobweb phenomenon
1. Agents respond to information with a lag (e.g., producers base this period's supply decisions on last period's prices).
2. This is usually associated with agricultural markets.
3. Prolonged Influences
1. This is again a macro time-series issue dealing with economic shocks. Suppose it is now expected that the US interest rate will increase. The associated exchange rates will slowly adjust up until the announcement by the Federal Reserve and may overshoot the equilibrium.
4. Data Smoothing/Manipulation
1. Using functions to smooth data will introduce autocorrelation into the disturbance terms.
2. Example: constructing annual series from quarterly data.
5. Misspecification
1. A regression will often show signs of autocorrelation when there are omitted variables. Because the missing independent variable now sits in the disturbance term, the disturbance looks like ε_t = β_2 X_2t + u_t when the correct specification is Y_t = β_0 + β_1 X_1t + β_2 X_2t + u_t. The simulation sketched after this list illustrates the effect.
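
The following hypothetical illustration (my own sketch; all values are made up) omits a smoothly varying regressor X2 and shows that the OLS residuals then appear autocorrelated:

# Omitted-variable illustration: leaving a slowly varying X2 out of the
# model pushes its effect into the residuals.
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = np.sin(np.linspace(0, 6, n))    # omitted, slowly varying variable
y = 1.0 + 0.5 * x1 + 2.0 * x2 + rng.normal(scale=0.3, size=n)

# Fit the misspecified model Y_t = b0 + b1*X1_t + e_t (X2 omitted)
X = np.column_stack([np.ones(n), x1])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b

# Residuals inherit the smooth X2 pattern: strong lag-1 correlation
print(np.corrcoef(resid[:-1], resid[1:])[0, 1])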

Sources of autocorrelation:
Some of the possible reasons for the introduction of autocorrelation in the data are as follows:
1. Carryover of an effect, at least in part, is an important source of autocorrelation. For example, monthly household expenditure is influenced by the expenditure of the preceding month. Autocorrelation is present in cross-section data as well as time-series data. In cross-section data, neighbouring units tend to be similar with respect to the characteristic under study.
2. In time-series data, time is the factor that produces autocorrelation. Whenever some ordering of the sampling units is present, autocorrelation may arise.
3. Another source of autocorrelation is the effect of deleting some variables. In regression modeling, it is not possible to include all the variables in the model. There can be various reasons for this, e.g., some variable may be qualitative, or direct observations may not be available on the variable. The joint effect of such deleted variables gives rise to autocorrelation in the data.
4. The misspecification of the form of the relationship can also introduce autocorrelation in the data. It is usually assumed that the relationship between the study and explanatory variables is linear. If log or exponential terms are present in the model so that its linearity is questionable, this also gives rise to autocorrelation in the data.
5. The difference between the observed and true values of a variable is called measurement error or errors-in-variables. The presence of measurement errors in the dependent variable may also introduce autocorrelation in the data.
Consequences of Autocorrelation:
The main problem with autocorrelation is that it may make a model look better than it actually
is.
List of consequences:

1. The model is still linear and unbiased if autocorrelation exists: Y_t = β_0 + β_1 X_1t + β_2 X_2t + u_t

2. Coefficients are still unbiased, since E(ε_t) = 0 and Cov(X_t, u_t) = 0.


3. The true variance of β̂ is increased by the presence of autocorrelation.

4. The estimated variance of β̂, and hence se(β̂), is smaller due to autocorrelation (biased downward).
5. A decrease in the standard errors and an increase in the t-statistics; this results in the estimator looking more accurate than it actually is.
6. R² becomes inflated.
7. Estimates are linear and unbiased.
8. Estimates are not efficient; we no longer have minimum variance.
9. Estimated variances are biased either positively or negatively.
10. Unreliable t and F test results.
11. Computed variances and standard errors for predictions are biased.
All of these problems result in hypothesis tests becoming invalid. The Monte Carlo sketch below illustrates the downward bias in the reported standard errors.
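
This is a minimal Monte Carlo sketch of my own (all parameter values are arbitrary): with positively autocorrelated errors and a trending regressor, the usual OLS standard error understates the true sampling variability of the slope estimate, matching consequences 3-5 above.

# Compare the actual spread of OLS slope estimates with the average
# standard error that the textbook OLS formula reports.
import numpy as np

rng = np.random.default_rng(2)
n, reps, rho = 100, 2000, 0.8
x = np.linspace(0.0, 1.0, n)           # trending regressor

slopes, reported_se = [], []
for _ in range(reps):
    u = rng.normal(size=n)
    eps = np.zeros(n)
    for t in range(1, n):               # AR(1) disturbances
        eps[t] = rho * eps[t - 1] + u[t]
    y = 1.0 * x + eps                   # true slope = 1, no intercept for brevity

    b = (x @ y) / (x @ x)               # OLS slope
    resid = y - b * x
    se = np.sqrt(resid @ resid / (n - 1) / (x @ x))   # textbook OLS se
    slopes.append(b)
    reported_se.append(se)

print("actual sd of slope estimates:", np.std(slopes))
print("average reported OLS se:    ", np.mean(reported_se))  # noticeably smaller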
Detecting Autocorrelation:
I. A plot of residuals. Plot e_t against t and look for runs of successive residuals on one side of the zero line. Adding a Lowess line to the plot can make the pattern easier to see.
II. A Durbin-Watson test.
III. A Lagrange Multiplier (Breusch-Godfrey) test.
IV. A Ljung-Box test.
V. A correlogram. A pattern in the sample autocorrelations is an indication of autocorrelation; spikes that fall outside the confidence bands should be looked at with suspicion.
VI. Moran's I statistic, which is similar to a correlation coefficient.
A code sketch applying tests II and IV follows.
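
Here is a short sketch (my own example data; only the two statsmodels calls are the point) applying the Durbin-Watson and Ljung-Box tests to OLS residuals:

# Apply Durbin-Watson and Ljung-Box tests to OLS residuals (statsmodels)
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(3)
n = 200
x = rng.normal(size=n)
u = rng.normal(size=n)
eps = np.zeros(n)
for t in range(1, n):                   # AR(1) errors so the tests have something to find
    eps[t] = 0.7 * eps[t - 1] + u[t]
y = 2.0 + 1.5 * x + eps

res = sm.OLS(y, sm.add_constant(x)).fit()

# DW near 2 => no first-order autocorrelation; < 2 positive; > 2 negative
print("Durbin-Watson:", durbin_watson(res.resid))

# Ljung-Box: small p-values reject "no autocorrelation up to lag 5"
print(acorr_ljungbox(res.resid, lags=[5]))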

Remedial measures for Autocorrelation:

 GLS when ρ is known (Cochrane-Orcutt two-step procedure)
 Prais-Winsten transformation
 FGLS when ρ is unknown:
 use the DW statistic to estimate ρ and transform as above
 First-difference method
 Newey-West SE correction (HAC), a large-sample method
A sketch of the FGLS and HAC approaches appears below.
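
A hedged sketch of two of these remedies using statsmodels (the data are made up; GLSAR performs an iterated, Cochrane-Orcutt-style feasible GLS fit):

# (1) FGLS via GLSAR, (2) OLS with Newey-West (HAC) standard errors
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 200
x = rng.normal(size=n)
u = rng.normal(size=n)
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = 0.6 * eps[t - 1] + u[t]
y = 1.0 + 2.0 * x + eps
X = sm.add_constant(x)

# (1) FGLS: estimate rho from residuals, transform the data, re-estimate, iterate
fgls = sm.GLSAR(y, X, rho=1)            # rho=1 requests an AR(1) error structure
res_fgls = fgls.iterative_fit(maxiter=10)
print("FGLS coefficients:", res_fgls.params, "estimated rho:", fgls.rho)

# (2) HAC: keep the OLS coefficients, correct the standard errors
res_hac = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})
print("HAC standard errors:", res_hac.bse)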
